[Binary content not reproduced: POSIX (ustar) tar archive, owner core:core, containing the directories var/home/core/zuul-output/ and var/home/core/zuul-output/logs/ and the file var/home/core/zuul-output/logs/kubelet.log.gz (a gzip-compressed kubelet log, roughly 100 KB compressed). The compressed payload is binary and cannot be recovered as text; extract it with `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz` to read the log.]
G93a"B.`#8kctE<-),s30v\CVc*1"%(/q,a>h ~=@ŚJEHfdvZ4JѮ:Gj֑TX,ՀR[."om|6ȽQ? }3B;Ag~{?H09ځJ3"7PU2>&>濝̤o>rY\ _<f<ޝ`'4%O`pKvh!P$_I$\5,{U Gr9ؼoO v~[<|qSN=wk"odN/.Wά Y $nux)rj++ђ-G7:nUev˫Wl %- ?,Y w۶KEhjßyYqyB2$6$.6̯-fX^#<-HH#h8hmiT`R7Uu*Xs4 W)?lڇ[e~?0NX!~rO"=i]/-`Чxzih|11OzquJ4-Y^__շG?_|~^`߾ǫit8F~yßೢO;*[-ܦhd EV4Y n^qB`Ӽw~} I 7Fw5ɩjTJO2DD}qe*,*ϜJ -JEo'~rhtDR; Z1! )Om0QK^SNU*&.%]=Ȃ pWbW(t*G0TpYRʮSe_tʕ<I$\cnE^јM|%+T;5KOz(hW]crHU.UBPItPvUt/)4A_;+ͷ սѢ{Ńo\95g0˃mGZ՚8wT߈Ch=8cۈl[v=?aeNWVli~j7헮vCk~j'\â+]D KZCWWUOWRh]/ZCWl[*~tE(;:J1""¶n{ABPJRy9]\!BWVC+Di%Jц "B3 4DWV<]J::B҂YeZDWغu+D=t"\wtute@[ᅳ#nM0Ǜ臔xYoC=%8GW*c6Bo-s8֪ ۣ~&<~U|Խ ~V[ yZdΓ /]2OeUD\.T1woV. )et|9(OrE:Wla7nJVy^e.K%#sUPU(!lz&d1 ,U(YEk@zx2Bަ-,uk"Z Z~yP"#hmμMX{n ]Zy0Rwtut儳MtEM{:h ]!Z١\gt[v=0:ht ``[U{=;@WcKe5s5tťf\0NWR@ZDWXn ]\BWv_ٻ]#] ZCWWɶ5PZҕTΘd=k ]Zeq]!])&uW++Z3wEh1Θ Ҭ5tp5k }tE(e2xte&"ζ."_4]#]YFFЕՆOd ::BrV "` \mBWBClGWm+e׋f]ر}n/PT沅K]A?k]8-97C~K#n87iLyҳM_p"02uqųEMnˋnGORO{ Q5Lj:8$Y4!G#. @GN.up 3)CipuP.yb[cR;ĬW逰 a5jjRz\Tkn:1+grاqGC" |' jKe-րͪ;h. 欗-VNC N?kOt^WEl.2E=ڢCk>4y*矇 vq; N2_3b95.[VVk hd3u`.~yNB7ǍsT5\̡|? b; y|{@o//ed/ MYဖqltONڧ`O mwUW 98΂V{9ZUf@ZdTDvŇZ2څ|b:yQ*[ `/ei{ංff9҇p$QzAfA30WBY9FܬtH儽Cq%5tt0 i$n.Uj/ŢP79ɦja{-KaGEfw,86Vr~(n4kEUx9=A_TVްW,po7'a%SpW݆_tʥ;wli`re_Hҡai:++PxN6`* pY*d@cVQaSgƼ` $S5T xj]K@]V58uTTBZƫG 3 6;WHCsK~ =Ǜ7D <ۈ|\i|T,ش$F08>פDaBƬx,Rg21اmH 2NX zGI4 GZ7F&gn p#O}2S+0:/u$WYpSqHNVyYDZGxF;RL0J(w WVw޾7UYW\b{cpg15l{ťlXwtq`=q]z qdZ0.PHx3jg>N(RFXp4}y8r}9iI6Hvm}?; :6kJ${ƶUl z4ALDfKE"W?ÀGpa&T责S{IqC~҂%7ih 5(§Yn.m];[oQ:cF4Ɩ9D@n@\""n5dO9DŽRn&x9)TGEdQ0A[ AHCƶCsuzH{r}c.x^d⧴KKgD{O)ݛ'v?BJ`Ou 23T<0øs'& fu6@W†Xb R0 lFO(vx(_Ӣ o oe5f_/(ZP>Jd_ÝG'n=ufdOXTZ%\_RJD+a|4LzU"؃{D<-P9עqֿuDx8[Iu3&+7$t‡&;^7Pv. 
{hn3ѭFQBak"X`,b.PË(QFL)8FҎYWp-aj^j "ZFx]Llta_w<( , x~;&Hxt òd m6l'8(‰znA1d*44먭H:_uLasu#Tx] RrK/fE!Jb,A#EHc%P 0ԫ)h2.uO2m<0Q$J}F}%!hrD+圦)kXБ݋:_x/w55mH肌n%Xί!$( `ḢITԮ"5e]v7vo\&?뇬 vE%=du:9Iށ7)">LSb$wq}H2#ՑŐN{Nq{qNX4+SV_;?V7N'' Gl1%17T/y(QNe3\0OB - [45C:Uay ðlp0q1Ű6gtl7M.tyN6WJ]La#ab\ aZYwqgERc﹅ũ]r?.m=;*PҖKSQ xTujܬM5Q%%] ] ۙN@IϏ<}~zo~<~1&Gǯ| IpiN# bg~ىw B( GNCRF,8w()^ w::2y_b5ӜH, xp҃,#K Ls7zN=JAZY*d:Q|FZ2,k^]AI,.E=YY?+YV:6`}[1Ę_׷{U GNڟϾ?Wgsun\|?W'=Һ7\y!Υf lQk9J8w$҃9ԪmQ9Ⱦu8 >j^jA Iwjr hųV/Y' a+c4.[ٷ6DN&ΧoiV sxr9XDGL$Hz~FChK'-JK2- &ۅ9^c 9giq Iģo7,͗1&'ZA۠FH"IHH˙eFH8v`-r,'Ex+0bh绲>-ѳTp~} J%TPE,vi.]b :(MH;@9H, X1brҪ润-_ ӟTJeQo_$3AJa @Y*CU=3(* %B# @%HdD@T8G"ONqT!H% ;CvֳR^mA%F1 dÔGAHNH J@112ؽ _ vijN!A8TyDs&cOzY%Y q$ٔL 6Zd`ǔl GqWcrWv&!7ڲvo1:-LkW^߶Vɫ"א _5OH'cS2 G1Z9e T=䶙[ƀJ-c[@`W-b1mBDs\rzKg^gBqX0xvcQ *a$Vy&r ~/pk-t˜KׂI@D.%&jW>PHzIHʶzI:M'~}(c5YiQgi?cuSU*:f'pS#{ 9 6k'I볡Ke߽??_VW`d@򿺒F <3w S&v9( L2jz!4Red<0]4qD Z8vn%m`{AV\| Ɲ@Z/9>wXH;b@ĂEF[_/AZnNelHSHޅbkK⁳yi_qcphPQ`NMfFӑsgq\ &o!݃.|>!]vg_D4&DU4F+}2eN]zfGΤSw2Fnj+SkmH_?!d/% NnB?%p(aU )#q1" F&F|?x=t21Zn’|k\V>3LԁBc[ʄڋbu{\#3=:sH͎"-G w8sx?n>k$u" rdݣȥQqugcױ_Tc:+(oft0M%h̯Z(%7i3fXp2Wxܣ#[ }RtWٕG[S't\k(Z"XCT D8˅!R K!"V#)Pʍ"O7J#"( -J)~HZ`nV$-C:,ɼmMa +rv*~u:SgiYqὛ>w[/{d#AH֡O5P*zEL@Gu'(eV)ztkТlEsIӚ >>Fϭu'aH=0*0[e #-G--zoeAQ@eݽ?m+[\],qy4<03mMxf23Ńxs:k ڵZ*e {vET#d C , h%:PȵH t;Ö/mL]U5?]#v.=:::bgÍLDl>`ɘ 83kXBjg5],k̲C'^F3XsI)Bdx@GXJF l5>qK i^f^jƃX]<(CMc= }@"b\c(374 0y4w0 `.Sٿė&Mo! A42i5γ5gjeͪ(s' ~@>łr@-5Rm!ye0ek"72}Ց,\G9.r/b$?3(у_)JEqA8,9mWݜJ H^hFwm9|D٦F,rO#>`8U#p",#V8~#:`Ar%g}.F͍&A

3ʫRz29ljxNynPSƌki^mdTHmo-.n|B!O:D0nĆnN 2nss AL\>}zkĎ>N\?SFq|9jtÁO.ǁyttF.?0tpXqs' iw%\A._dm܇qceͨ-%,@`:ϴ,sJÃ&>sk-*4RLY$Ҿ9lR꒵S],NUֹUsۑ/Ll9*3x/L>l0X6LravֱoɧqŪoT[%A>k5C>~!PK9%v$x--+9^[>Hy^:T+L9iq("8m ,%&  JIM'\ImD>x4>*dQ$y4  KXIXH`ORaE4}Q"!&"6Q Ƶ]_5rv'MN>>.,q aV \6("m!vW`TDrrȃ:=O7@ H)&E`YH(o^Hj4Ef_~e`,VPNg7GH߀FQl,Qz)nq]ⲅ)np0sHg[E‰uXTy>Z:]qC:? Z0.PHx3jg>N(RFXp4ޏV^GXn*%`;>t!3=V?Ov5mخ C#A=͖33ڂz >>W 9)yrK!DEL$Hzl=zr>~Mr }yy-9EK=ngi2Ur7V)-&"2- k:0GkLIā+u'DJ|)yaAhBJAY&Dn3ˌ3qe]k%l\v Z簾 c-ߟ ֔"t'fZjy髋K7w2OeӣM'0:bqcEqP;APA KH5ꠣ4 e@9H, X1brҪ^s[ܬP4׳sXR*zK,Eƃ % k4TbzfP4TJF9BKȈ&*HΑC*-Yk4ԳB|jVrCRMQ)$*Hr.ЄrJ`dzOX+$VkPFIYO:.~x8{ a{R*XE= TrmNqEqBE HV Gf7{MKr[x}SJ֜!nko/78( +azQ(&PxSkU$g2B0k[{vYΟ&"OU>ҲJ* :A`6" 3ֈqL&(pGz 0Vz&њ&bȺ o^%7~O3S1~ ד~ 9?M~#ўɝIRo^ DVгb^!g}WhW_UA $]g)<ۤի%k&%?ȳ mۙ*de&5lYp .#‰.6#,hc*U4h1hQ[ ::_u sY䞋TxGDJNc@q[̰4N+ )T!ʓH&v? @^qU3T%/O, sabmJjK($mUh9eM:Z8Ҵ~7vjijH肌n%X_CHP2$50IS&{EjWrTWj k_}%AF=iHwR/kS_0(ڛ:+8L"읕VeDD`#!H)#^T;ik/7u-3j{Hyza '=ltRȮD=`%APB=dⅤW<(CAB7`><8apSee&_Y%Rj3f4,60c/5Jͮ ['_oqEg.hzbw-/=:Ң"M]é iuOQW)zZxz?'*?!D!e!x0k5f,`ZFL&ZIxSG:os9͏T*ӻtdi93m`bnHyn $9jipjtƩ< U^rÜQ [XEnC.5dnӟw8%TS]G֎["XV(U9guF*raK!"d;f8M(rneZGEdQ0A[R.k%|)$4fsw'$_hN+)` V37i pɐ7Z*?Q;lz-Z/e6`vOym8ˢxzDPr&4#&Z.ON$!j[<+D/jOP fh/#Et$LEt!"'Ubvz[eOH"88fSa1 A0 ('`! =rl4&=pt2 ɘV!!: $H[XA`Irl:Vj$fXrXӲN:IkSҎZZyZ9~l}7ta{ S/o-2~Be.ḭOၱXhT `sxn yRA襃ы 2PԚbON)D1#d RygPi 1`nc^iGj Y@SJx->c$;C$$ B: %jJ~mX&̟uScZ<ۙ mn. 
&c H^rТ$Q -9+`"4aw iDu[O x) }zVX6H} BZCcM{_25;)Le'Ws2q\_BoN)vKnCo0J ߧpao ϥ ra-\9/HjWspq`Wd,k4m>wpcz8a{ ajcUM#Xja]SRRQl5K{yQ脫$a~ n4iٙ)!}U UțW?f6)pp=muǧS8]^?n]HJok(S/™3 eг ,QTg)grԝ9|%xSPBV ?*NBg.g%",+8ܻ_ya(uLݏuR7׫^zvҳ_7POT}W~ pR)@Oա^׃gW|;=g=(yWoƣϠuqzO=FZ,@d47Kf#▱ť$.3<^Y,Cjyڨ$Vՙa62Yt f IUp;oB!& cF@1gz NVFߙ;wb3yMZL;{iB[_g 4t3/a:#ۢQ Y{LDZdz>/*dgRI '+Xh`MEDD6N9^?=-~՗fkx?+bXĚX\:6N)ђ@h bטjtsF&D|$_Jpp^׌V@i6(҄HRFi9I1/g2GldALp&qK?B5RG5Օ7/|HSzoDUXk} oH #ck 2Vk X>R$!zIss4N s|pI8IX Fv~9>w Iԫួ|j!|k.K0퍾gC 8Y>0Ln z|0?\;){?r](b{+dMyv9F"rv.ڙLµܝM@{O%P0 8+:mqkB*DƥCReyw.f).4ucqNF&5aRz?ލ)ɌJ@7_)_>^,e,_+FΧə)h~qdÿnaJ*P&?4{W=܌gvz5n g3k˫TvQ6vZ }wes C%[b|}KM͐f8mlfY> 0: (VppWzrf?48![%hs Zm+%ޏuZFR/K?g7N})c%ﻹ)_Kޏ_KO͡'\l~Z}F*GҫޮӮWmN[5s+?Ç}ۋ}D]eZV>kع wݏOhjګmӴ9hWќCnh/*-g]R(M~n[d%r\mA(;C@ViBl4I r$T)L1pRjosTmDжVKr @%):("'JqTQ)N^cXDcZ)"VB"Iy`VVI. 5'E*Z+'c%su6\N7;i+lntrORH)6 tBQVrPhXLo(1㦰Hq CIO·d8NZ+Ick\q3};Nzd0%X`T9F$㑅H"RSFDD b FрG!eLD:0]! RXoϳȡuY.|anW33V~i{w*~ŝ9))i 1(r~Ηx),8M%)K$GQ4H`\ AA DzmEx+dk򡞆dMx2hɍƴb޿9Ð T0ާ|_BHkpO3>'b遫U35y{Lz0wp/N#Xңn GP)a*Qq0Ȕ D talșzo=COs gk?poVM]!rz=%ȺvOu3xW^OkΏwz껃D7<3{ {à4ieA@գnt/7a2'59w)QHBzݰL}05#o_fəQ@:;Q2!u_a|*fXgVͧW?xų:G="$ǝN~{\\/,݌Oz A ERO鿿AwknDoCf? jStjxB&ާ)dbD[;BM*8gy $0HR?WEABz4ȵ)A r2QO 5) k cߙS<4)_?wp$pA}A{C~R P&{Jyr?DN`DT*jTf#) ۘ&1)5 8k;.aH̗E0G.eJWe6} Z |b֮bhF T3 '? qB<YHQlV|_ NƟc |ϹF-EB-EB-EB-"j0US)?^og^ AHRVa 2LwZ /`ZFL&Z܊T r/=w4<̆ yFw(׷,n9_Z3TIꈮ)nygWLK6y.^;Bchk_' ;lxvI1/\;ޖNZXu_NXhв!>{?XbƟG㧉R)m//'^"W\LDdm/Q϶dKu|t ,s+86 IcF1 )`)$Ť?W?8m=RJ=ZSv)(ƚcD} RygP(i P0vY4a#A29HysQ 9K>cm9zk^I&m'v}lۍn7e[DNOKEޙ;3Ghߙ 1 v2$L`1X1IKZR0|%g If$uŧ'K^ #앵58,K8y44քɧLl 2?Fws2? :[J &xBa0fˇ0~y'=70Y}BR>_Fӿ+|%{znQ^Ʒi}P DmR ft Nm4=:T&xj8ݍRU֡P2㺸m 7ł9 ˭ʼnuXT^jdEW~%,\(wp)B0cF@,UP1b"iZ+Iqց8^j^jIH#x直@f_,?W "PDPIPh{wδ0C0)p?9- -AƒEiCHP$]o(6uk ~i-,r+x"T D8˅!R K!"V7)Pʍ"O֑aY;X+\$JGƶ+2F U-tNd OU;-!7=9PPywOƣ\B?$=WTۿld2g̖ٹ mvqs;D Ŏ/z6$ض,vg^Lj,;]\Sk˴a;i{NN+^6 3eG#B#&RzU7&XxM <  5~HI!cVXa4Lp}Zve6*otd&#QHYu0k5f,`ZFL&Z ihѿG2eO軔! 
b <* s~]`XZʩR ˽)7N cx9+Cafg95vD)' ҐKoUK/GkQ:.brK%bdF/cL)lE^q5_fKCf~+4w;}*/f8Mg(N)PQk %I/P!`V0/2ZDJGt`ZD"rRV^VIXBg `*S:$ rl-!\0Gd1mHس%^z{E2FHH5pFBb%#1 )DXҠiCAR#1s)wS2ְhKZ/iS.ZŽ [jAuYEm.b8AWDr>HO" mspVo疼pBm|k"/" &$eyx[ymA~_iKNEKsE[s㩆.` E.RxOsT|qDBb/$ɗ1&%;V@i6(҄HR(i9I1 0LYZ#g9Сgi 8Q]Vw( MN!H6i=$f^(yGCIr D.`0"V"$$bň!IzmMr']sJltV8C8-$ţK*ֆiJ#7e4V̠h$00rNMȈ&*HΑC*-Yk(gY~{bVqFH4i_ÔGAHN?.Z zcbd)_ Wz=}bSHNU\ ϕ: 5ʥYC63߃?fHMP xsRY\I6F!QI}10kRbq۳g.=r ~޴*xwي3_cP; B#L9F'RP ډÕߞ' NB_g՟J`PVtb քT}K esmy0$1fqh|S;}nkS)ϫM+$t` k?>qނ)-$-m>܀LeFsS+eKk0#Z>ySxyi39ChLm)^7զii88"Fa0,d`0a1i[!z1c(ɡb$ }}1]o:&2K~V;G@QIM?v2wVZe|Q(`KD9 AJ,LŹFIRΩ{H% P!9|e`*@3D8% )!Hs7zN=JAtPBTrc4+?(sV؀]Փ[ n\6oUmƧg tu&gI#WVt(ڜ-r fPk`Lv{֤s!gT՟Dk iLz`]М%`k U cJ#68R$#7g)Oc.*5Xd.ɐY10묌Yv^9 oCSOaMཤ ?F0@NlۀB wW/`E`_Nvٴ`r8}i܇~V}-km0(ڬl;K,-3K+}&`Ҧgg?'oMTࣟpty08ճ |P%Ao ?YlN_]6[x%ghl}A[a*:~.?*PlY9/EjjY1LnFq)|`?$LME{/>=ɀhhק [O0C5pI48m7 3 ffP9  };C /mr`z&%&Xq$F0 kÄRf+`"{&}0E4^J 2Nh64?o s!]ߘT|\Ҥ UÛq"Up.Նw^zRc/lYp0J<2%C(K9أ(Ba$*㠈rg "6¼SgK]0)26TZ|eGC'  ,YKј5myp E+oԿi!)%7D 33 "Νڛ켘6 (жI*(,Ct0|hq(Pʍ0y*[ A1,"1rbrX+7B"IM ڦ5sꕬǔ!x[b#gmC>m4= 4.QDF%Cƍ+.A&LݬY`^ :y)U8J̸-8gtS|0Kjdnp38ZN*X>YqG3L #z!Uxd!R,HHԔ1тFQ4`pHn0]A`?>Nb9V":,w|nn&; Ǵ{/x tqPW.xͭ'$~k^J~' uJr`~q;q皋;Zrb<ہq8ξ{6%p*gMg1VR˅Zi䕙x !LQKd2J2 JO0DYAL_^K7\l`B)$__Cr׼~u,bw&]'UZ[Ajw ҅MޮӼQ^oȏfYM77XH>u`hp`%J7bi1g;Ja8$1*wN7K1C/pGU QUh7gNǗ×qçcϺ |uČ1#3<0Z^VLHjB$U {Ԧ ڨmq/b}'Hʭ;I7: -qP.Li[S6UkX<_ߑ<ԳWZqbufU\]qUFJܛvB^5$iaR}s[3MoSG6u 뤳j}M16t&v/Ɗw'1oCb__&/[Nw9-w¡aE|йM_O7O2ǟ$s+"]mʬPD/a_6?%7ih 5(O+Lwk%Ē3Fo]">WG(+׉>m_uű'IWl;+mH_?2R՗cu%yA5>e)!);~W=tꁺ~F6>8L͔A(ѐ0%dQ 6jkI)Glx߇}b2Ve31g9cTT p \)-F͍Dc.U.C!B2ᴬV%J.%IKOt_୉H{JDkv}BlXR-c,+ȍ6L5oȒW`E'=T:6{tl`)'!A)0I I]::)lK8J#N`!iNjM>;hc'Rm.|X+ZX,^SnR Ri#꘎S4riol 9rqz{T*#D9"sMPR/A[1灩:颫^_),rT1yeL .AB霹,ǸV9kdNggurg[{T*AI. o.,Ho1m(oB=A=[$NNS3?~o,ml>&3Y0ncKU;5KRWSp_t֌ݍ:pB>X w"9L)A PȌ>pxr\:"D,3L]_Co ǘ/E)10K,dr퍜-N}4aOLGLw@ '=$+߼Q֭1}({|ש,rW(c1 :23:H ~. 
@"sBOI-i61Q\iNHœHjBThBR~ edZA8>&j'е#+[> O@VYS$9AiBKPIe9 $Eqtu2 gm &>O\XM:;8/:b}߽ x+HL YLːe'r 4r &lrD3 0\)UmglJ)ZcX'ze-Z(4lԂfT h 4@q)<ȳNZYXUO%{7< kr0IΌHְD_ 6(1:ɢ -<#^z>jkJ0d<^|dz!Jk*e @L-`JU>{[KOԖO̥h q0+?WRl޾A8;E񆬹rxDq6l)b &|>P{ϋf:ϳO].;)?!OH<1Y_k%c8K&޳c$vS< 錔] =g1 Kge"37 B&$-jf.ѬϹ~>a?wx>x[E%zpvRYFpI9 p6ĭ!߿Ք``jJ;-/nFge~U{㛳Ņl1 1Qϭts("Vx6kO|G01f#eIMX"$1&s4!LTL\f#KFj=wd 1+ tB$2J%)ð$92*dR2I Cz B=|x3WK(V]ye|[qpN461Κߓ$_M%FJN7`͡s>0vgU2#> 8^Zvmųώ 'V<{}HkOHKWFʂذiQPѕ*n`H4>fDc\~VHq>'Wgl'9we-`'2 & Ε02tUʐMczIR(IJ*醔 pM#j*$V|,̣2G}L#p/ǓTV?bmڟ/FG^uϾ^/zB٭_!龀to. .'y5-_/EtC[^⽞~_j? _|q9rB-3h:Wi˟/3O_e{гWs"qp>>H}?8) pg[Dm9O}|KwdžG}><79ހM% ve-îb'p_2#PQZo&džOӟq ma@e, LI:s [&#gn麘ۅ.c ]m ?}Eon715lv̂ڈ?'7lN\;WXIzm6 G`:3d<ս'ˀEګuwt $>=dN(6&z00+ ˴mlќ CN$GN>iuܻ+Gix0F&)[@0p@",ZeyЖ< dcJ`Od$ZAI7)J=pFtds+&0攊'jTEĻ?Tõ!vH8J,R 'A23#q)Z bLTExs]2oӞHd U=q7`^B07jǾݬk7ͺvݬk7ͺv[YW9ݬk7ͺvݬk7ͺvݬk7ͺvݬk7*muf]Ynֵuf]YdmM$kݬA"d 2r~* = JjA/ )=7rOQ~4H@x-/tY|M|iܓm6qOj>Aˬg>zə$ )9H2(Գg{ZDtǎ'7;"zVQsKyH"`SҨ&]\zcMMkt`ەEM\Zۑ. +hLr2ؓ58_ 6(1:ɢ -f=4kez޷ E0Ն?5ܘ+> ӣ71n^C_ik'o> GLVDK?rmq o%h[l{k[[)Qt?8إktw/qoT_R>cﻙũO_:KR( GNCRF,8w()^ CL- jΈ!.P'u9JW(CAWB9<8\s-/p_w|eX J-ѢDe]-/wKps$%m^h9lJa^iGjd^ wE%KJ*lJ6 10OY&̽rW6ކ^/} ʥ3>T[6treừ }c$iw={t9%÷ZkmH.}}fu$% C"ZdY-2EfȬ"ZdVjY-2+o=kWՀ3A:->A:%g`3pRjos3{1{K|#'ǖOㅲtyCS`L2C Tz4e ApYdr 8M[ FaoqGkZ )$kؙ81"XbB Z\`~ZL!4-aգ"%88hO5laf['Lj>b b&bh9lZC E3n 4.Ǹ0<IOV'fpf mLzeJItA 4cҺ`)e$#q0bQG"ʂKM$0+C dGcٙ8[rqxᕀOXO]5bwxu0! 
x5%K"8!H01 8OۂiZkީ;y-XhrRfTetۨQ1+ NPGNܙ85I x<Mbۍٳկ=cOQykYdzd#ƺ,UL>ygqgZ_u*/A ńZ (]@tQ^_=tPf*YU% (Y4uju7Wp$lQB!9Mԙy9v1n]Vvr,m|9myAZ&X.sГs֡r<7'rٹo>4\Yn5FCή8Aay a Bb'yߨOSg;\^O4OK7(dWΠ*5{a(ԸOn|ohTA/9C&ɧv`o/?yQwͬKoOxCﮔnܿL˭wqx3igWs^,{z!kU[&I.>CodUA;W;=i?vY1bO1W<묡-zuudJq`XKu S˔vbIK&v vh%n =41ʝ[ϖeq1%RG2_C{)5Mǔ0 o,B|,qoY.ql؞[:k%{^\tV1n[L*2~&*Iǁy2nZųNPiaNBtN2z)ZgW^_xj7P EaI{o@/OuM D0KeG' Q2NZ9r9pߢ9On'yr tSb_ln}m9mwitNVtp^P#S$G,Wbyծzȯ& B /lPqx{IJKR'T@'Rk=;4ƚcDs0cBZmP`o{b;8[WBxFc7x , ׄNQ7&9x8@nj(h3?RH0H74SS͂{Xm*g{Sv+Ov4~I`Ny/`qM\Jؿn=φͪݤ >L@/* YzKưҳj[gCjluu|,i6OgOihSt7?Nu+tlΟ KAek7o׻GշW-w͓_Ӂ>U;񰬍;xA`EeW =x&jJ<5.CϨ"Ǡ.E.Z%`9VLw.|:%gH)EВ2oOf}W]ϥ0UN|i;hH<yWY&~DZNw4{G9x LG9)vGꏊx[Ƣc\SX0ta8+ʘu`։u&2A0ɵ r B($S1f4j|sWL#,8ZI}Y; Wf|tdi eRL;--bSAzM--V ?7I,% 6p0EA3XR?p2q.P`:Jǵ?F4ƖD$x"T D8˅!R K!"tKv9|W)??6imÛp޷h"ϑ z5&a~uMzU7&XxM Sp3IC\UX8Лמ~\10m 0>/7~mWLgnZn*=OŹ7JCY_o?Qk6.]zɄs1"* CINSI3!ݿCH;b qKm>CR"jxTq87PNroDuߕ8U4\+%-UnS<9 SuG|;ݥJF(|UN9ťoW12#cLX_RYf&i'NxpKڼt;.P`y/'ۊF&o0ǔ 0yta6IJe{E:FHEbw4SUUV0t6B{AOZ5~en?|~;^P<; My` 2A /R+R[0EEs0;˜ešWҁ՘bYѧ0 -肌nVHCHP0} ICkalpԒQKÒJ}{aIJ,0ŸUXӳ1W D\US7WIJzJ3UXϡ>s=]$%U\s%zq8"Uv'_"xEP ̉@(yewBw4z5c )DʀmcrGc&~&<;#U ^ 6: _:I +pJM0F0CBJ[EL`9ҽDN](p>+b_n0k$1|h'0!`n4ѥ|%8N 1%&J N}{`t]cɖл32xX-):>Lńiy 8 ه3C* b4s"=R tOjre+z Gi5n-K{+fS3LȂ't \%aOL";H ,*(d.s#M4͌9of<ΌǙ83gx|^3/=,Sf)D3hB4SfH PvFgI`%>bO>Iyj gX`p-3g03gx3qf<Όǯ3Մ.8z8x3qf<ΌǙ5C4G43gx3qf<ΌǙ83gx3{gZ7wکe9(sw[y30ۏ93s3s:g:f3sִi9s:gNI(!KS>oҰ C)%/wmIg`GÀq88'Y,pXCG5E*$%GYw!$IC&e鮪Uuu29vĻ$D%FtG.fl)>};E@~ E>-=Mٌjb6:Fuޘ2_${΅Lh3PIh[/`DAL5JBPQ! 
yesX4^89EFk1r8FE7ș|gǸtR^΄vC{jv繯Ԕ$jo3DeŚMc&DZwR=a$츴69 HhM(DXDSpu B 8N*13mJc4e>b%20Nj4LHc,BK\ՑE eJEКr/bî"IPHB ;\ո-EZIMY3Ȍ^fYn.q}(c4,RP||CZ:AzF {zOӆ2uQNqGVfW10i-y B 'h(P\f BN=TKB=>LQ/a}rscM*]Nfh&NJXo rY`˾j ޲Zni-AS` K _`M"BRi VM|-'rM'big= ϙ%8Ѥ3ve'9qg& ] _s, 3k߫.I{J(${1ljҹcLf(GԠ1SQB9aGT+}h4F%x~=[LUE .AΌ&u brjFn 醓iEg)U_fi"Ď[; ם W*,̾5_|ŭzFqotjCV/z5ɨbjCaxrJK;M[w&ͳY~x{9>~n- F {C̹Odn9GZqYcݙ&7I# ꇑvyf?(u`żjG7cvϻe/T,zɺQޕ3ouB%#y`0/GR'{ď &w^v*oX aqn9; 9uu8Tij+F[Qf~+|RӮ;TiL"K^U!cq't/PHۧ'޽!?'w{w[\ຌ!J$Iu+4݂{C -ǫF3mƸ75ƪUT!1f@/!ƭDZ.b]̖T1H#HCeYV'}c~Rw0]!~6N#X r ڲkC7:4\ )^Q$\hBcNʰDB,h@k 2DʓFlk=w҄{xMy2 !舦΃c&Y +mGnkAOPBUrc^( X&8e0Đci$IpBDр\J`a6"SdwoSF&do"U @+[(y߶7FI4sM쏎!I|)^-lN/b!@+&dhсbv@aVvO( }|(l8c;!-x1& wTǨ&gqAdI$I7q.]ThT2*Q !#zj1!15aZiS1rVcD- 7hq`h1Gį|xk5\l{e5Sz5b>ĴQ܅3bF, b|F4XZIip{*HБ6bOϽVPę)zFsKc9U'(o!Ao$4m~2JvKRFʫŅ@hy׹ׯftd2ƫ=*uϭItG+:Y]Jplb[Sn]e{ƫ;]繖!UO;unyGKcT>7O'_]va[/I/4c1 vG=Ps[k?mδq|p՟YnF})N8E#\ae!DӠekPK].&gq'Ӆ^fx {/@ !k,v>(f,NفGz==i=v}>h -2 Ÿj_|S]Yd^t7VI@?qU`tY+7&oq'qX8[,vRjwk=Eٛ5qrrYV2Hf+`>`ޱ[ 'V(Y(Vx}2Q퍩ݽvVmM_M`u> M,FӘV퓖o5fq1w[.,d5/,%h4_21#³0>揸Wnpl؎ߴbW2Aӆ;&iV\LZ1nkL:6qiTh&=qtUffhkS5nx|Yi@905jOv' 9N"F1W MAq >gP:2>Z o6'7s`HhSxcbCݶ/]>a몽c) m+& ;╙`nxe&WC)^UtߋWf*5iW9q`L.?u{_k7SX^J+~H*L2VB*|V]u4\r0*kɡ+V*SI[uՕ\1r@*L>c0Skž+R)V]uel3`62v}WH%+U&XڃAWH{T|>J.ӻFyso*u0j}u0*/ut6]zJ?/=2~4\tӇ=emrW(WS+ۄcn :p93Uv!! 
q!s"K^@n P` yp.Cw"zOjE$Jdw~)T|.nSyV?L>MW,+ʴ" -R-ٻ޶$WGꮮClL L`Z暢߷P((%Q6 sȾ|U]Z)_ _`>)2ԡ', yZ͑i%23QRўǞ)lg;*(x>{n&&y<7i MN ii-݉ZF~EUj|݇@R&WuO_ pM.~j9>Qy\:"ڢ=uZLK6hB,uBw\"?vtp(?f|VGzvjwC%L2X̬2erN()uOIX,A" JƵMa"CPIx$VtI dU*1 D05ni ƭk;{_h|rGeyA{OL:ϟNKĤ@Ȯfݫ Bw%&=,2Q1o1 N މd Q ,EcWC%r;V@C]@n`]zp-8}&6h94<[֥Y_ˀʮʁ^4~GKV HYySO,3Gw&r{ֺG?{6ߏ>[׹g(AgUX, _n[Anݔ*{{APKڳuZRԝ RiR蘜hzscҥs:"w'9it!t:Ȏ$]3BTk 墫Kz 8+A}u딃8aS!/IE1t)RcQ١h彳g/qnIOЀ:<G5$`4}Az۶6LPXhgJm.x_Xk78: "v1DD$p.Tdؤg;C^Ʊ%)S.e65QaC遢ZR:$eQ `& DQz\*/ UZDl6oJ}:P.΅_p,W+y&`~6;Et^4=q[8]]:лA}\"9Ӯi'.盯o殺A4tZ&g[^^)ם -{|.NK.NzQSKDPDJUl7X6 /s_t˄&xbB٢II ,^ p$Jњd58]VVA Tk D$MFҊQ(MD8ſkY Do,(c1.m5YN{åW})ߌN^zQsZ\¢w493p#mRp`2'5cn_)V}SRYLM%q\b58xhm.>B]),e# ֒t QJ;W0<ȏeB8hM'tpN>OeV͟Ѕh8~RySy՜ T)Mlg[z@OMITDsA񢍖pD +ؐل`)͌23e@5+3!b.[dCha\~;cI! ҐF%}1V*mTF]FZE.R jXT#['vawEԽVՋpuLAAT!I܃rS߱N?lJA }'/'5nF;pN$kuKPP@1p{/ >p=bW(>3Ԛǭ>BtγEFJ.TaIj!ʇxZ#$Ev ]ڞm -sR6V(8R|$լ 5F *Lfl2to2uzgǬ_g搭Nt:)ە7>y魼5}=5/2D5IRM0reU$*D,JQ&"MG1 $ ٚ))V:.یs*y__2ZK&p4G t.dmEz e I|D ʂdB&flBhGd1^5fِz `PN"TQ&/1ZT*I*]&^dL&YS` H<@K@N?3W!80Q. T J Y;#ELl̺O;:@fP+MGDxwNg#mډ&D*KEQN'2ET!d:&cBӋ?ӌ&ʟa%[,>o'|?#2*FV bD@KH ]eC4"HIGvqNcz/׌De>h|:2G',ԉ,dwE0-QD F b.hQh 0W(d[PC}NЋIU)G-]K7\ƽ }ިa(,dLIFToD`KGȄ"\dĥ)>Fus Ix%Z4>7r"k(XC`V%u`Nb߳n\WzSa~kz) n5YSx$9%@ }G'(@(L&tG&SG4Ogd*GB&a.2ϼ!YJE4Ki~[?B<1}=uC.ξ|o@*t49JC?4HX9~]"s J5]=^݌go/._x VLFic8.ɻ*K"&Ӿ%ԛwa_kF2bHWtnX?VkY&3z;Xwj|'WcOӻgQA=|F]VN]uR3i#u`T?b GX,E+9aZHcmߎwI%>zxu_UU"Ae{tv~:KN諤kw,02[p<^Ƣk \uaz&82nEH&x61:͚.% #CPI$dEE[ښD儩82P M]da *p~Gjlb{iPy3p<}UA~b Gw&R{Sѯ{gx 8&T i~N낾ӑwIk= gBZxHC9^jcEEb9a\TH#lp)Pw@mکx+|ЖxxV"2eMEd(ldLNUBֈibQ&化x22Wx鳵ZB n 椠!AcTbx~O1 "ܢΛL6+q#pU]ӭͧO]~O׋0#% Kg%S ܔYTBxT*AJr`(bp`&{޾ޝG}kʮ,rIA45nk |{oɾ1Nٰy<9H=(kgd+&fTd9&~b#wLvO|y~j|ܱG#r. 
Щ mF[@gbP΀AKiWkY[EF2vM1ZR:$1@5ղX N ;R-5V#6%twMB Zܡ=B& n'O]3ho"~-dA1 bZo4!csit-oΘlM?n Cj)fD _+{Q2,9!czrɒ3yDE bbF": dM' DAkmH__@FÀXdqHp ؽ5~TK)Rö~Ç(CRSb[4{jzꪮ 2 Hz$-L%L1'8ÁHT i'0,F΁cBs;}{%-qAė>lwضB q|Z9o2 qAZPuL$SH/3A 4R1:Oۣ k,*R**k\8VgMO&0^͘ #91Rb8x4յ+_߶vbۯru!OCp̳&LjYBsF<J .qv~; ]Un}n=WRy˪T9K9ZWM6U@6soqX>N}Vppk7E9TW|lw-Y só!W'*,;~2C҇Zf'}D}I- ~?É,dmK,ӋA_N9U˝WMK gſ2b3;f\x}dc}{OхsÎKY Z <2IeUv@j2Ϙ[dLl q̣[NռS*LD tRc݆@ya(jRw''b߀\wHQ+JpQ:2M ,/hEz,o7*O7*C_]C{cEG'BGJ*%b] qW vLFq|J(!Dt|;(E}Fph1O W #ʢY. c{H1]A^BX.1=PVoZ)i6BO0Mq\6PGhcCo"zy@׫~> ;Mo|uva nhD KHqZ1Zj`53N4RRi(;B1 ` ZwcQs;]^U `*@hTFb/E%y bׁ:墫p%!:2D!Y ("LDH<a U[k4O'N{;m:x/rVz Ų,|z'z bؿfn4nF Z:j1;oV\ E\7ٳm(ڤ zD^DQ$'}pڄL/cZ+υc h E.%eKuI UI$te`$PO1&$?L+Mx*VZ66;i/cϾ^{OVs*è+賨5/sf 1]y_x}x3nY¥G?h|$?6??o2ȥ4j=.@Өk0Eh)S#e.=5`+g]{@ W z޼c7-@~lP4]팥6 ] }z_E^-w ^{ouB?ok oW~i /jy7wF;x8+߷x?؛W@P}Y;ػ+Roǣǽ? Ԛ8spp>smg m&m\⩉q RrnV{$ aQp!vkL#reS#<~>YV+Q`kw:wt0GkCPKZ˻Mĕ6 -*JD0 Z2\s;Bl`:*]@MD[FTP$@&gIO 4(7qɍTyĄ҉/p5nc+7Ce1>yEѕk^T౨Vn/kSPmPD*]*4 R+! чL@RX% wTF$`)cb┡:&YtMJirG.*_3F)'х8㱺Pօ£:j:[[U66M¢l4n0}'WlR]33βFJ%Q3uBO J ڔ6-/z5a.x#@wa2) eC%ǔjA s%$e*cHƬ(t˩)8IiOg) Gi2 UTs0!ܧSs`ulTIV%혒%B(ȭUh|WVD;M`RZ<1fDMzQƠϭe ȹ/%$.1;x -cč (R<:r%1V𞄜 =uB* rb@,X(HC4^k-'$ pփ˘fB 4hp7>s0г)@Δ%kDOHyG: ڔ@mc浯Ҵatdy)6+gSAk/OjC,L(HSd1n&(P%r,XjsMb *ghZW1"F<`d *J͵ y,.TTIkƙ"# tԄ%,3%\4dQ'rV嬕?^-[W`Xd kD_;.P.%!1P*h#*$oQ2~ʜN @IZ'9C!Ie. 
e\Ȣ4$he5-ii^V1sbQN脡 F^YHb@9uLqE@rx)R0@(8ΪQ1xTB),Z2t5ߓ ~|PD$DYKej SҚhA/RQ\Di-EM6w1y·<pf4٠ 6yq%P I9sM**i$)HRţxY}J/,[ &ٰ7N!M;dջ{kr%onzLvig&J $r2!&(-œMV9+ Ɨ-Qt55Nq\6Ubh35}GM@YOx!kZ!E7#!(X z(OVG'X'֔{) \&&2F"Npolg<T&Fi%QF9S}ˎ`F7CUє`Q<!4MLt&W'=NBM&؏ DX\P@NF ]ܒ+8Gv&񃻋 l(N g`=u)oR l'W0V;\~uE0f)UW#!\oHݺV,V Wo^Nono5]Np(rm&Zm V\cƿCs~{Lѽf_pW~h/e6^|:8B?L|m_۵@ٴmiww;)}'!66wfYa8ju`^lGr'Nλ2ˇk׳rfNrI+,e$X"hԳ KzA{yIka~?4N;-{?>eڟ|rsP.A_MQfaWz޵u$ٿBf܏j"c{'d``b3ZÉg}O^$bI${~TFjG ɫkGMu_Mޛ=1Ho߽y7߷߾ݷo޾F޽wo'q&Anm_ kãox4h d,)fz<u g{~` v5ּJ˭wI)w?c` Nڋѭw~P3_=tE}=̦P 3dNY Ogl3/Y1s^lAle/l]EK5[5BH?;_yig xl)o]/&?_d_,ivN;5<&~~259WÆR._p̃2MvZo]Έ޼1o.7Թ3G[ѪYI/9c#>lra?-&OOƂ,:G\O[* G=ڔ )La2䫬ZԦGe'YЫLɪ2 iEL E^NNx4gor~?!EQ4YZcW4 (]kO&\u?p򣟀(݄0\Ea^z':\VO%\u# WePpń+a7Jyխ{8\.?֭Ђg4nX;89;Eݳ{ޣqqgztx>D{gs\M;cS>9ܫ~Ś '?^ze 8(o\A9kgh)'LGiOW ibe- $ʫ sqfq5?͉m2$5 1cdo.KRѐ4ZҪvbKd]L.rV 9e--d0:$?/ G ﻄv~mo`NϿ[|YDn)\4H%nHT%U )^ؒ`A=w|I1jU kPlE!6cJbM/&SUq1/wZـkwJW!5 5zPkA/T2:#h QoڶcK1|2n-L+6Ku伿{RkdS1+ k LUSJ0&ZK/]8`F76c7c6.66eZI`t)SVUB)F Ѹh g@VZK/mC2JTG7o}?,%j ChtJ ]0ok x'rEGcT5-#y6; oCD"5 \c.8@w ڻC=>؉^l/1k٢-;C'Dɐ";IBvk!g:n7!>FΔcmY#ΩT.ml՜CkAhy Y]eidx8F]Uh%xLc\,Ɠvڎ QM| JnY_B #7F|^\,` 0e dGoUC7_*KI!oDJڨ/*!z[;&X E)V XT`t[ xƩsXRs3aCAY?(EJ0X|]rıD6VS6b (Ѡ3.0*ա9udZ=SIYld*A#)ٰU0 X,Yؗ]b-`P}-hAEX] ~L vBVܸXe+S5!bʚe # 2dB;9'I1)v\Fok(`B]j) ^1AwIͩ 0!W0c2oY! `Ǹvc%XY70v r@}L|Ρ'8()8Pq>B% a? 
` Ɇ pv]BPHL9+Fpa,<8/W4eIy/EzdoH_uW*7g%b .^vʺ@J2"z_\\=ZByJ ?՛J/%[Ԡ Ed8$5cȲָH B]{4l=rPJŹD:Sp>9C@b_B1k,8_W!*S^N0BL#U8̀-yNOu0~<ՂKVЬ+#߳my  n$ ja$aAƫC G\q Yd(yP]i-Z.]U}#!'a!hT ~AԒ"4DiyM(|$3]K}4n=E""A|DPVZ76dn+^ ,TɂNU~d}*yS|U%jLe5Dw <} [pg?/a d*K&a)V!i,{ إ19nbTmD| Zˇ$ @1´d7P PޘQ{X1z, 6#-h'R;@9@ /B9vR'mxT clj14^HD xdI.`}ӌ;2S"iU(M.y㠈ȱ)p}X(#0 1tD)ȡX S+_`u!ղ0g&*Hh(;֍!mJj[Qq/o XPUVP¢l B}Yg *Z6]\]oGWw1r%{ dcHCv!CRTˢrb4{jz]* +upyI8ѾV<߫jid2Weʜ@8H00ԥ%N`$4`[}5l &,+@R`nKfSj=gԔ%o9TVl4X`}nn,@ʛ YE͌:YVjr |7`Jp k2y-5ph j>Dk^)sy%;ԁC* kZZc Oa, ;zY+/A*0>?9' %ں #.^GN *hhDe7Um!MU tV 8EB~;X?yί-~ĘOK\ r/s rykRauWD@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H׋2K@@6/ 䚗xn'2\l<*/s/R"0"wQѺH'>\2ĽObE = "G`ܘplܞƜNY=2W(z'Ce } >("`YK"% 2Ɨ0H&":2(EDioT9Xacެ"2%Y mrJ4r+>J+I {c1VSb;X Ms7oT퐪/mvW'azݾ̓Kippo>irmeg,ŲXvbY,;eg,ŲXvbY,;eg,ŲXvbY,;eg,ŲXvbY,;eg,ŲXvbY,;eg,ŲXvξ޲ %-^n|#fX#d`WV_%=YX'^7ɷ|.76+o]8\OYHF-ǫq I".u96h}> FQsfAyϕ% LHԁQSICC>7u[ms}乓Oc`^qc֎NY}X0*m #B>`bqƳژgCq6#5,ۿ<ۿj:I|#(}4 O&J@eXzLDC\He防"BQ'!bJ[Ƣ`pːا&i 0aRXX-TnFq0qK?3D3(C-Lwe8e)U 1 `?ڶ[%Xٿ,~k%ћ7糫_Ur6F͙?hr1G -6߀WTs~7L|x3m ._]7ζ#E"~hxq9_[^v-P[mͬ-Ҏ;BzH~۰H{0bI+`*fM>>/Ysx5z8z.}ϚᦫgmS`m2yW9=}ʕk_JvZ ~8u Rϱ,$㐫??25La3ۤkJ`2x~yח$?_6O)3?} ?{E꬗h~|>{?<ֲԭ~[s#u]?྆|mVB+B }qᶢUF>]kb*EwQO(7DrGɩ [ׄ36$ap9$yZDUΤ&xŗx\t]A% fٙ`_LL_JL*):Ά/*rUomu;wW 3&)x2DJ/S9l(akS0|HY;Sg;0l{anXv;*[=lۊ?.]/. !f"\D Xd!ϓ 8GH6.%*`ꊑ߫8Rr><74 jr@vܺKOQGG8kr>I6?!iZZD+D-F~ j":s4 txqۇ`؞a ݙ >{3YkJ%q~˚:?t/H!,96_wg/뺖E9DzeA.F~jg}|<+`W^,n؍#)Zl1TuhKT]T]j99k9ӯƪk4aq8ck@*e,EEUT2PNaW'h͸D7+*wo~T&Y_l-e0t賱4]7hnz) d|\cG%Ccv8 x ҥ bnz;waFM[e'w:Q/bV6.2̚,CA9I)Q|<2&gηWss|ۀ/dtgkY F&3C?EpnƷ=3O; >y|y:|ș[8sóFgf,d/Ľ|W1lCW U>RKC@DҶ%d\n)BL8C)wJRn#nvCOH"i $uc\MybXvЩ#udmaظ31O 38 |bxX FT=pv3\fMLN~q:nNSTZS֙ Z_Z_-th^V Ȁ>eo"؝GW: ޵]vޛy5qb浑C. ?n'an=wU V'>\{Xۚ_ _Agua>F1m5 }g c\/dW3ũ0ůyH'DBE. 2EUT OIWWR9 [\<=׿=d]X z,>KMۨ\Ym\xx:%*E⹩J6,{9\o;XF6ϗq(RhwmF}r8{h[8(!$;l~CvdDzd^[d<|p'~*sy^?ɜ˼\z2j9koOnϧgNW?k*{X_c'ޟzNz{fÍ9. 
ݚdn~z;zxPOF{~8o6≌Kxrj2وޛJC U{rmwch֟ zZWbvDk::/N1G8\q.K#ҷu;pq3+?ҽRJ[ /MlFڻ'% ybJ̋7 ;~ͧFh?3tXMrCZ#~FaʺSI~YrJ+au޶T!Uo?ْ-xv\8Lrce'8Y2<՜kJi =!kHcɾVTyZ[U@72v3g=2RvοqĆ|"j19g W|XU삐Eo*0/fP&%0`8ba&y_)*`(\R]dezNgy)Em4`{@U"'o5J*M$l A5N8<Sm仅bFbA/I16ksrFHLR\i23vH{؊icvAqGAl}Q#^}(ZFK[MkCT@!M`K `v=V5YeVN1J5+LQW0S%Lf]6ibwe9[HGm&:W95VXKó_k'Ǒt=;TX\ ɼ?)"2'=HRdEPF!A& Iҷѯ`K4fۥ9Q7f n7Бs*Ƚ;Wu 4ޠSN#BP nW4;P>6c/f IjͯOyGyOq׋*Cq,gly+ Ong?ori\g?|?WT`++A-5(s˜2bY{Xsg9׬?dzKb")j>u=\5+W/u]5pU3L2ج_joVqہ _zvzda-?J3vWaz Wv|va?sҰ}<]:B: EqW9*CmoϏF3Ye5grx \ 3i%P2й@E,UKsbw PB%zTľcnChL?./>x%߿ۨ6w2@eDB@1kr Ub(3\ r?.UVC\?/zPSewhv}i!—,,}㖒qvw;ݎnbHx&#g5"4su|Z[kvҬ0ĵ29&`l0 m6cJ dJcvvR(˪mB6 GgwԁSZk4u@VaH6X5DŏT+pU׏t<+dC;ar BAb`\$kΚ20k<(J'kfO+9D" b+*mEԕ۩ْCr̞eZ[q|p6<՜kJ) =E[5$ȱd_+F*i<-* g);nX,tЎXXxDneb>r݃xON>οqĆX"j19g W|XضC` -xSQ12);L3RtOPC*|J/s#v:;K徠v7x,jSgԦG{oO+V(rVd 5Jͦ`M.48 F1jbII2Ҁq_<}gD#"̯Ţѡ:&Bm$-sL̵5"I#=VWz~1>:ž (fg9D]8R=% 慺lTf_"[qڱ DlGB2q`^B2B2~)[w^1yRv r´Z+tJ*`Tb>¾',NZ]ٲZ랩 1yH&71hk V 1o-XHڷ(D>.%wjrSB*>BŖAńё4^fr нQgr\f12'jKehVmޢZ&48ͧ6)^K*U\Lu"*ŢV<x?yn~of IU)1Q2d8)lS] s:Gv<a<G"rf"뿷^π7ZF2ȼ&@*XlIl!yhd$]p~+"bŴ>rfD7ylAjɲAxDC˅β j[J2Fhceg}Ucx k2W{*yctA/fmbxZn#OOT7Z.dr&;T\KW*QDP dGbEt&1L\tf ~:Sx: < (,Kw% m"2}sNz9#ӐȄpUr{Rܲq9:zuANj "zszR:ŪK7ʐUR4HD$ky^ʛR4IM1E}2 @LA*cń^lI2/VtvfmLZA cWiíڬJm*\umoMajHIH^њJ"o'^ 5U YSJ#Vr5&U@(n}08w(qܣrZl5R9W|l31 cъ {RY~PJH>y=s0}f*wϧ7Ѫ +A{׶RX)?Le$ú}!Ixw/z1nqRg׻?]wRե]vsnu ݴ%&}Oҁ9n'ag L&Z1Txz㺷 $߿}Ï_>b>?#,?`]F% V${/$ރo[tsu Mmҵ9ɄoЯ9?UZ< Q`Cf OWG>>X쀎Ҭu2wVZe|Q(?G9 AJ6ŹFIR(=T`SS`xp҃`5(@u 9T+w6*R.v\TB6x!) \}x S\'R+B0}\0]77+UU_&YQE)3ZT.CM1Y"ʦBZ6ct2[dݞ9 lJa^iGjd~sVkjwE%K0Uv뺩mSʘ]-.oס w=x:)_:Jآugy;r<6Lrị}ҏSݿ[,DslݹWCv6`t:s78Tku[=6Vf[,XZ3,gi=Y˥V_9+X0J`D$4bs;^Vn.tϤ6"(.h4^&ǒE*L 0E4^J 2Nh6d>F)Љ)ϘpЇ/mT`3ܬp|KX3^ 3_10E\9)79YCd΀O4裧$tGZ;x5Xz/D $P%BNdKxp ,JdVr 8MV J&eB( -J))$16"hꉆ>kqjk1д5T-όvȀ [Vzlj~k|H&f* Ψ>,>P\ϥ>T|}D%g>+E,G|c5LhC}97!`Č"+1 XrS|0XVooe0[1r}Z/A%ҌI낥|Dg *N0g #z!Uxd!R,HԔ2A #(H8v3{0d``B|ep4.o/"v3 ml܍mK3މrM Ȱ`Fn#Bq i C(''"uKr&J@-2*<;y-XhzSfTetۨQ1,%! 
P:Y/K ǃ~,gV^a[}S4`+rv0=˻<S|;(=HQk9Pބ%#ظ%Zxs]}8Mx NAv4m,m'L;$\a(%K9]VIrݷYeIOZ/CfPsokD Z?nm(fM:x "!]$sd,y<\𬋄3 JW, /?ݘZXz+ǶuDm6v=̴͉qI'}<|-wJZnhb[.,d5',4ᯘ!pIlWkevN`e\"h$Mצ!W-Q1Ay$*e8fId`{a8+杼fakWk8.rƳIi9@93+jXn'N2 u,rzͦU'\) oCKB%՚Q=H&S2+ٹԃj1"^2QU Ar)g\9g\+t Bu ՕTnuYάfgX0QccN0aӏU[Ƣc\WWz@4v!,  Ƙchpۋ[|6. cn"" |߸'Ç%um ?uoo )/M PȏUox LqYC*Ncʠ^ѹwRx/Ŝkj/9vR1|%gS4aE>"?R+UR[1EU:[6;˜C%h~FP!|6M"Yw/ŜGTZ`^G#3g`L8ȹD8DE]Bu΋`٨D<uӇ;KTꢮ^ |N茉`.F]%r%:uUP)-ꥨ+?-=FH3<#oG.;?ڎZQvT hsu5n[.K)qM5741O:߿?Eͥ@DJOGU |T*Q>Ge9mӺoaȬGOGU |T)08>G) |T*EV |T*YG">GU\*Q>GU |T |T*Q>GU |T gT: 0XΥLH@r>=Ty5`e >`MpJВ/TC&u6ǘB훬.k3Vvc RF$^i28R <Kn}=09)`C`9+ZRdPG ]+.}Ԗx ysf"(ܚ195j+(0g Ef]ȋ.\H☁ !fE;$$N GQ>4spМYf#g>liqOE#f]5ʬUшE#}!jd^ !Ɓ3uS/IXpU(4(bH\JDȬ^#.&:l\^(^E/x*kwqWgqzfJxߑٻ6r$W2QH\g{"a<pHHlSl&NJX)* TfD3nh1h}tEAe&\(tɅ&4Y%&AS$?]YI_zO2"[k̅G\D'1s c"y3 YD!!TQ)&&h$IZ^8T1R f L]4tFΆ+>xү[J05K{Qʺh3xb"\pΛ#7:ܣ}&T)fE-."/t5 1$:ФFPKerkժ$l oĀy;+e@F*y{AܷcP̸G4yLXծu<20J(),dgEW E;W"| Q=gssmMͥ*K[8)YPf#- 5jݑHO2񎜈9ԭ =&W.lA]ܒѠbsQ(,f* ,xjͰz {g#L {ǥmz'|8sW X\uU &_ ThJIQ>ҧzSiHof6].+並喼"<%1Y_Zcc:H!SqpdR lF'qtq0;c%xr@#,-:#( sVI>ʐ&hjut7ʬ qXg|)͇: ~ihbxe\uձ5_oRF]<^Ԧ95^uJڜwW7#ٺ1K{wWhĜq k^q1;?k4;h# f$#֏N j0W= bQuN>. =sx2J~ hPh)B.E(sE+"km&FmN딥*ViviZs{^/114o=)8AMk_HVYyq}A^5'/²#Y}ޣ q=xg}H֘]Pթڠ!Mw>i'1LJh5[bVrx)j0B}oT[7*;hTˤY5{$2CE4J`l%2x @OFyUԮ3]/Z{=6V72W􅵲 CpGZ sEDH&m?vn~GQz6*NQ$sDNEQN,'2ET!dG'c(;WE\EyDRF^^A/{(.92BwGj &IK[wszoZu">Xu6OY$0Y5\[9VAжDy%X $FEIZѝx^5C+cPߣ|V7F+-M{K, 2Z QO }ɎMP|kW?PѨH Ț1RGew[fj,"W!"#X+ED 29mT 4NZ#5|ŢL-112(xt*""C)椠!AȮ3rFv}Ey绑C5=7>W_FJQ6dI5?VpIT, *6=P #s)-SJۉ.+i MkLq_[W*K7Z, uj} Ջ0'3"X^ d+&bTd9&[1O8ωLگāGؓ?'#Щ mF[@gbP΀AKiWk"sb;N1ZR:$1@յۃe%Sc1:H2JKbXJ3rƈkzo@50bV xv5_:a"{8_z5obb''!0.;x1dܜ<,VM^6loOH6+'2ݗ^'sG8‡P9[#%g6& t Dt+;U6KuA*JzlB0)3(8EeapLٰ4ԜN'_/^ t: >&vQq ]o'=]32hef Xt,`2 A LQIYfƓ`lcF'{;. 
y)5Hapu .IBlZbrrIدyXY@YYb~qMZe˒r[d8:ylCBp4oԖ870 A4-[^ڷ ]i}IʈCKqX$hĜluYkep:(`76k ʥdkåmGyBsEzf[Mgsɻ[ˮJ o}anL$}i*G ZkO+ N3(JR NWj{J}4J'>Yd:v62&v8RtD, o'sԁw]BlHbi0 D}S@CLђ믪mzuK(9ȠQR:堭1N2bT3L"5z!V;nV\KN?m{զ0{X_*,XCf\/Uv5*wsϥO Un5aBGc}VL4jA&(=$"mzϕzϝzϖzM{3T4BScKD9#R :`]&EmXkdIdM-OiL3ŵSw͌9􁄦zF>ܵz?1"s,2ݽ gGЇ'hvr>ˇM74.cfs M(4كbsG?}C8-*] `Vyw3GӍX-o6rL?#"g_9i(G*b$Ee(1xjZ\]osԉuV?M~ ʔڦnhRR&Y zU9B 1h2*$zDiX(OSLLƂ9ZqfglPhښ2؏q<{Tɰ3>yT% M/N'u*~y=_[[/. IP`2[ {SAMj}2Rd"C&, A-rrTJ=6r3ΖKF&rQXGȁ+l$KCa%յf쌜5(e'3x.4B'Յ+ΫwM ẴYZdqݽ0MW!1PJ[H5T̐ +T9j4㷤#nN*PAAAUd0̖ǵ=|ޔDE$>:銜5vro*wڶcm{#85ɈXsB μcLd"lzrҘ!E02t66DJd+:)k2db'ub.:F51yXFcU;jDױFtF5}wemXXd*LL4$)RCR56Q7KHgjlY"F[rO_I5ʛ.a;cb>_#xi;tt$tdr`(X~ nǏͼ* B3_ j$"-,P4 YjQА::*|x dM^n] ƺjUgGvP1GX1wVSA#ArMR\5MAq5RЀwq>49E!y]2^|աe[3mu芿 VmB^4jNͬRֲhjR#gL 6Õo8D߉;#~쑴lƼ{Om؏:$cNb) % YZ{sxr\;|@D'2Gea*e&|c8Ƥ}sDǘ,LPk#9ПX;79{{'z2ęƩٗ~ƍ- ؐMٝuo_-}eKTgSOYM >d~hM(.' dU9#r2dtȴ$ي8V q, In5mgկO;9;#4Ɔ^0 8#@U9ꤌ4'eV4>\U<^!v## t)d hcR(ljP2de9FΞr@ZFocT9di H|ReP0bu4Geft^\xAB'5EkCOI-rG^DrQ;!O")9 Q Iܲ眐ԪtpJt\vU9a{e+ +Yb$'(r^h*ɳ1'|ZX c8( NNVm5xԭZY|O^@!e'r 4r &lrD3 0\)Ջmc/z'SiC]/2c鼩LdzrAH$7Іyp8-sYG>a? gw[E2%ٿ,[NFpI; l:{%{4%8X>&[qQ?BsһLѽr*B߶>bl 1 -1KJw+"bO ?\n6@_r2b2Jn\fY~&AL8:vbfѓ5鶇z㪂m_uVRXyrG2RV>czB+{%[=~\jvZ)t0}ןfqᕓ[]BBK_Ɨy[_] UmyTM#{uAB߿?Çw c) d$ biUkiZoҮS7|uuӷo9^VXߗ3OĽ>SeoQ Z1  ֧l"3>gD4d&flrh9JqXh):@N0Nơ [`2 KNS.L&%t NP'^ɕ{]-.8)R=c$qp%Jry9"$%ifɍ YJ2o>mtKZXdX<1<1zb?{'Wo{3-r7Wg MpI,ごGB4+y΁-CDߌϿ\ ~|eY-hfOyq|o ]6"Ϧ|RC"rw֍ElX㥽Cqͧᅤ ya2m RFs& J]箛q=Z%L S@X, EBB2 [T3hel">%{Ncض,Œ s DR.8G+R脷=Wp4mP|C=!;b2xwP읒kms9RE >Kߤd恕ӰAOFuuWJ:Cj@y16̛hKFbqNe}=rrꑴ]ܤa^0o j!M"PAeOKZZ v2U1T )y$/CJ C* yZaa2ƈٍIC7w4ɠE'T$,-'Mkl>oy~|K|[ϠCA#AL  21sksv /Gfz'F o@qx2O &IzuIV'tN83 >Rʦ Z0!b@WjoxU)@i#*fm =e2NЙ(ƒ@o%rU3q|lƣ/7|>?͉x64lu;tg˘=Y @>'=U32('ɰIF@V! 
haI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI˭[CjJ׷꯿vbV^57OJmVߑy6{tQ:>!GÆmtw,&é5T{5G]WZo˾@gwi #M:^!QeSiNHڽ$aZDk#-f@|@XqLØKOѪ R4}h޹>ֿ_<޽[=iNpOӄܓ~HA ޓc[Y}*0uU}^#[ʢi*U*|.t8V?n׬vvyo) /<\X'Z Qɇ#kjR#}=_NopN|K*]\yhaK=8g׿v>؄]˟޾ϼ'949uvy1!٬iޗ},\Cs ;hi-XZm6$ 5L1iP$ϔˮa̓]%FWVku`3~_'WX/eߋC|3oNmޕQ/oz~0f~>@/O.1S/\H.kxo6GmT(R^CmtO]ThORvH:YZ/\H64\Wטm]s_؜g{8WPH~?rl`sD0)q͗_ՐCX4livWwWU e~awpa66]*6c 0xKސjMW>7_vU{F;eU9JXR`ǃ(8Hԯ-ܽ@#z|jEgJ7\Hь^,*AQ| i x3'@$4iC{i?W+ƭN2n#a/-?43s),M"~e)Yo `t.%x'&> IfyvGϵ4"HB,:P`H B4 y&IaK;*g@z5?Ppi]ەaBwԤL:x8pw%p˘ )&Ԃ-xZzrv^a,F 6Gbb/1(pC ?tltqp&ઊr%$Wy~;X:x>ŽqjT./Vx<6iM'+*|,S?Px./cOdrŽ8|x<%;N][?qo W.K/u ڤ 6 Ua@:rQOfl1:b~N>b*E2%7o*Mtzz2 Ork`'8T _5ɨbѵQ489>@٦>+V4/]̊ZmS<{Cb28[Cnb6ތ Z養?M7inI-aH2A?t "~M^gsJ5VN6Wpq QedvyVm0c_szL eSFOoq뷃ްx}Kfv O]?bȣW 7+a^:T) 8.vE /@H݇w~?|w(3~&e,WǍ$Gi?ݣiV^47bmfvU7zR!0À@Sjyuĉec&e'')QN ChBcʰL9,yJZLzv(ϚKRHLr)+c7.D*sL:tAԲuSqwn yÄLb)&΃c&y$ń6'GnφuZ9tH5+{ dǝk\C⼨ cNwf9Ŀth}GfoqJ W#TӡAw7Aԓ:B{k{.ԓ;C _I9Z֊W]2)9Ϊ*1.ˀ)~<Ƚ/}Qvt[cn#_wQj ;M*6gⷵYƯvGCFiD~>m@6>j=4Yu(cuu{r_%W0vc}*׮.R@9\ `E EmJ wo/MS??WDVvk <%Xu&+PS‹䃐X]'`ݍdd#VM!4%I/çnwuTwɻaAw.˯*|[X[ZVy= BZN$DՊ-9Dc,2һ,Lˌ?8DOC4+* 6̞%srJB,$ DY q8kgCiWrK'׀Gi;Z*OW⽎yLp۰DLa1nFOx$[DzK`9Hnk[!$ g'a|^Lj8=A3!9(5ׂZ',Y:\@rrF5rv;;z_< L4].P.AQRA?V!{ zSthS.+ aJC\q1>R%eǔpQE2i -P5&}bq^[Fd+O4;y$q-c+$#.( K Aq8-5Bܒ!2iG$DY K˖eH2GKLx i)Q)M mmК%Lxo|b kYx;5Z$ IEGn@1#HH&ELJ'Iģ=X2 "qC,5nNǸ=HY)EW HIpp6y!gP>[ʤ}Z^[pHIY / Q҃+%Lçde³W,@ų[0Pr=r@8U;޾ޯuh؅rVΝ-ۭ|l-Η1yli x; d(K)4e;ۊۗ `Z{S E*eВVOKo,}gWZM.^MU z"B b" Ak]?X1Oqq>o4bOIAoEǘ0Q#8Ÿ %I4Q'%#[ň\J8.pT1EFY !#zɁCfkǴ҄1bklLj 7h8(Ѱ\*1)ӳ#^u|J78_F l=wb K1{.\37Nn=L0e1hTHWbrRHJKSIܐo:UroGHQqtV?KS(AW(.w Nfe{n*W8/{|z0`2Ж.7|VlEyJB3 1DEn ,TgUl٫~Wba f脓1&l AX?㪖ը0W {M@ m FTG<.|vf ⛋SJRҶ:< ba<[RwFe>K5g8u]B88!V dEڄpwO&-YP~39!1]K">{B]6K-QU4CzlN 3s D!~w?|9U)I5R[iꦄw}=\9:(`߃S(?ArN[ +3E\P?޿' 3>ncT"IU*z Z W :xJy9Lmq|Y(k>iX S89@m @a"2%u4KO*F1C@NXMWΦAS%E 3bHE&G{Mx/ TaE %K󉭁ͷITHIm|&-T' 0LGħH 3:;-Kev˃='}M^9b>d1DiD=(cHA;HYK|ڃ푌,5CA!k qTmk"gcA $JBeQz.>('zpPD)UjM)$1[ I28\P.*vJS˅9RmD&A0^PK. 
c*))t)E/(18-"Q _)r]rU06t;"#/u; Uhyx"^@`\Zí;Td`0kAz%Vu|R|޽pRy ^!v%r#iA*]Ucxd5{+t crVzs0JeB8UP[wvީQKtdJn"m~z.g Ҿ?||m).'͵I֐*KL:r*EH.$OADDPq-6D?h':(עKɩ(T@1mFi^K%Dz| B?hsҼSz9$.v cuktؾLnWx^#h) :'Iw"zOr .z,fDc;P<(hrQ4ϗywJ4i{C`y 67]`AH2U[nɒmٲLDx,wbYu VFhJ=3"P 8Ehǜ3t>"˴Z¸`9>DŽDgФo':0g%CCUAe7Wp}% JN/.P~]" Njf%xIy<՗,M^x8Z|8E7o65ܤ*J 5Ji^q -`ow1˾K͠遼ޑ槶֙s72?tQCh}򝛶׳_*ꭓ?vZ1Dl qp>l2y-mO_p9VivZ{6t gA=NTDy'DV  d5ów~2;E12c50ouXռNۗtOg$LF|R-&kd՘av!6cą^05C)S}$8~<$QEqkW>Q!,ZjtFEb냒CةĹS]s_hd>gHyLKQA(WKR`+ hvg Adĵryqe}?NkݶXP}^" p:9N)G-X+@K'U53יγ$s=G:r> .=P-+iZ3x"RIzeF9=@d*xZy.S=RRƹt[$)N$J" .#(:cBbkaZiSibչt:cΥ0>llҝ >,_@ӽhhpE} ɵsiFmX~ǶAHW hO5Xu9|D?WH. 0DҠB .e (򘌺t2 *+~kęk`*JD0kdF[\s\8uTUF5):Q 2b g@!d8~4XN͸Ơ1%xQŪsDg#,E|-!c5>._'K߆5 {K1qj#5N6(J F*]74Ud&H$LL=JZ( yOa2 UO {^C`RNGPgRJf,V5c9RL]u* Q2jI3xCJ9ɂgM.x#@ !hE5$(B: /8){Q΅aܯ[~A'hbF459jģF3V)#%"' p[s@p&Hem /m(6)M9hR3G-JE5&i*B/_:kĖyԋ͂(:IɎzqأ^Gxԋw&=$Q'roB1)9 Q2zqzPa18}x>܃ c4o{w_1 f(Z-ޏ*~~mZ,z7tօ6ө>fgZ>EQ3'?s!9Bp%J$`=&ypD|in2L\&5`9w%}XbY3 dDhKF<4! 4ELj깎"AЄvOu{T LP^Svn9ڮd2Wbl ?x>Χ%  XHjx6ē!oNhRTՏg6kF~)e6~y>jYE;8ȡqZeۼ8+YY+㞋񈎞 is|8ʘDo&U'8b':8Į_j/8vID!cƹ~y|u1,8e mPlW6h>Cߪp:ſzHWm1mzXVz]j:C{_o@.VwqiP}曪;@J3*|*vpOTq.,:M~G6UKO7͗ʂAy6!#26 MGHup B?jdx^dmDlBxa`LfpB xCLBMg 2P&,P)4RTbDKpփpB('ҁnm?t?Ҟ wgIx]mo#7+}k4/edE,9gbŒ-2%v'QS]U|z*MrR3 8_gl;y[%{S؁zL 0D-4FD(2Ѝ5J&bIYc4`^3v XÝTB#։lP`W<uD}ݗ@ӡ: q)Y> /[»w]~6)mN>0sxa$Ժͬi:G-.lvYalRˆZ.xvv~}+1Lz^iYn -n%y߉|M=ox|/{:+wh.ntsۢXmOûa錄U FcqN=S吭ߩOSWHNU&辪93ԑ;U1R(ɐ3dQ̄3,IH,%y.JEgE~gVg~n3jKWjr/Ӻi͛,T 'gD=LȅCV,=IEBIǢ&qI{pPda  {1 ,fSKfL{+<&;㽵ZlOQٮk_#{h\`ñg1:ߤt〕a|o`+ӄ28sM%͇ϗxl/>9Gcv8]k`ۗO`%qÂ`xpUhPl󡴓1j@~1+&]/Ņ烯۠A\n-0Aۏؙ%؇_]B;0-!h$OL3KJqɤ޴FV֧|7bNAyDz_:3v: I{4c{ßqxY {4u]BiT^_Ϯf~(KٷK߷wgEg˫O/Kfo]oمfa:ަ?2>M_?ߜ9`Z4A5*9xq;φBOw>aת-cG4R짶- @8&~tt-Q8rN_hz9<|CZfoz|U$:./Uvx?JL#!Uz5b9kPގ :u,$O)+ uvq6qgq?\N+O m9gy5qwSMlnG_wL̄aܮoI'xPńt!7/[rtylĜ ALABL :'##(, $kȵaIF+3_jvL_M"ݠ~uq۪W}~RBӇ/@Fp)@ GS:F{ d< MA4,Ȃr}FscUǮ:&Vue9TBfA:UȤu@o,$/ xUH'BJQ:"ZRe! 
Ji y9Џ0`̵!]lt)U[xsPh =er}`Xgj)8 &27>D'IZKzNh}S.p Jp~sQBxMbJ.)<.8 ga@`ꗉ0cS>861/1@5Fi@,cEp+[7nhŌA@&ךW1y@!c0,Mѵ]QM ~~@| oWgtWVc4(g1%}9R5;Gq<]M:ڦ6MW"D09K1dr"8^0k0\LȐ+Q&C&& vd b.^T'se{Xm8a/La<XM>EE-F ZbhoS>Fz k"$Mp^00|㤕.2%r3G[SߜO(Mk5MH%1fL6!'rתVblkM(%L,uJK}YC:G*G/V={YQ'mP7i94 837jNU5Ϩ* 8\3ׅ|r/a]ub'΋urS׋!Dž?HUHs 1;FNi >186֐-+;S85fSMwa^ȅ10vx޵o4eK 6Z_Ls'r㼵RL7!YPZYq=})ر(ر)(p(sG鐣Lr4g\w QٗlK3g9˪x^Z92u%=2$\,2l!:I`CZ,N´^ 1ҖG-`*ɒ^ d0O<*[/Ƕuq #'CG_hImLƷ7fzs=\4hhHQZ'a{ $ݡv74mNd}jkuZ^F^ÆSWnççt_ɏ^ckC ؖǼF)+8+)a0X4Mrkr8<ߍ:@Ξ}%DĻaRVgj ZYFƥT/4˃A*20-Ԝ5oJeT53\I^&q5AGK"L=zS^/zKghW4F r߄քb"Ιke1Z:g̱YYWVt98}C Hfެ]?7U Cz2^?$9q Ǝ F*gd|ĠB7z;\ ?h|괢I^$9n]뙱duꃵp'B`XXT:̩d1Gk:yҖs]Vf!7F)0/9MR& 1'2A<ɕ) @ gs'sIk{~2;@@/5e;CÃW>JSn+u|sb=Dr}2م_ԙo;*ⲣ[Y@*h31qZ"%teI5IH$*El4yhf4׳!Hcl`h " #NVg4;Y{xdy4B p 'RR0p&c(lS2 $ɼpvԳVxj bcNtr#'KN9Y$43398 o2 HU Vnw/,I-GˠK(R!R(Y@LO1sBZFP_~Ӯ #+[|H<*0$I %-y9HaE"b8&iz{}*db/FPY&#貿gAxJ%Z J:@K٨.XhF\ē국ގ&5.ɔ&"*|b|;'kJd" 2Am,L5RKOyqigdJ $J8" 6y`mW/?+HhViQ|AҴA,h=(wX*8!FeP:He!ik]by] oG+J;RwsMppCGĘ"e[YCE4hF"Q^*IX`'.RA"ݗ:>"! 
eAQ'$"Xe BJ" 8Nuȹa7ݦ#VxrDV@Q稹lOC߮ y\DF0d K% H 66F'X3ZE/%p[AܭM7+FIp F Cq`ھo< fBLd[KZYt1 AȌJzAn: Yòp0)s/fC0UU#2,̾u1ԋѥtT5pA9+hُ7?pibhcdqN>>yiW`:^׍UcmJ%HXpˣO_*'|Widʇs>^\pB~< }kbAr67g+,A+p|74fV?C U썎iw_}u_9Wݛ{`,kIw&;`~UW]S󮅕7u9لߠ_+~ܯu-ǓԲKv@`uB܄ KH&דVvbIA+&a<`)-d&4̳ƘuڬR1qPI}0g.[q2?trW0N0Dۡу-TNŷ)Qy&I8 `} ήӉ]g\_;[O6֝Mgiagڹ<z 82#ԟNx&Y-N׎`W] LB]EBZ|vw opͲa^;.#vy/ z$6׳HBV兔3~A"7+ d&hKe;Hu?EfoOlf잜oE=T; OP)ie歑F LTVj-َj&Ic:=JqB(IJ*߆LqYK"wژ!Cц uF9%5˴RϢ1_S鬩3ɞ.%֌'yc_{gD\v;l@fryܘj:93yn~Wp7Ȅ&Y Z[%E!A#E'c9:q itkJ>!EY{Ի떴Gemr#/a)nDB3fԉqb(Pu6U{Sj\*Ч$IoiNkJY㇪ۧ\vTibɐ%~pLq@@hLç 'R,nno4p&/ę#}gs\U  .ˁO }J]A0xî21*P+cgmߠk BOIf2Vq~]2WK/~B>}'-+ፂB޽?X _wDjߪtҮܫ KsW2XqA+;GAٗI{Q@ |4s (=zoiz 9/$]y"z@LYR@X262ҥRT<,cNryUʁsgۼlz{ {ԑhƤ-3[tcK(Rv<~c|Z^U) ?/s#_IzLm^ ;\7'ť!6sGٓ'/+#pۖadؿ-Ò0F(do%>pzP 'e6?AZedw)H.B>G Rr& 5h;sàkY%w(a-G_Qޠ7yu[P#u8̋8=Z 4Bp%B-|nX.5"l}V|J!H 6L1iv 튜-'XJ9h1kQ_urn}r]+'ִprw1Z5 2.5QZ6#\0A,?rֶI}̗ഫ_)Wj8;9K\yﵗL*{\y~9yԃGFgH@` #jQՕ֝Q1RbYȴGB$>eHG*ݖHHHVHu CMJQEWt()+&aQL Sx+ L@CT-rIjS4%] #4"[(8Eea.{&ugMi e_6Fq'od<]uVK.4t_nvِJ&h@!2ZrM)eTHbKE5+m?7W)goNR˷-⿕x*fpzef7g̿{5R=!U_x_T27}}\.3ڽ^q )Ķw&K7缿ZkANNE}qVR ك*LSބ CVSp9^nIrqr 01o hw B_~cG7ȵx Ѵɺꚨ%b"Uu\ V/yщ+gyU۲a_VʾޙLV؍Bt#{E@9 #J#dRw Cd@aJeY 5}֑R#Tsb4:`QIzoY#MЭ/g4}T:`2((vtho']tV42?}&˜<2n8\PFkmJ,+IDHʓ/ mXbXb%nin{ 3ӳv*DK6!mș ۠B-}^b CW= z!YBEG^(p6g^ Y'ÛO ]Z+c!/52VV|HR " k?STg ,S>Uݷ"3C~`'Bp2Յ`rY@rczPFfY)b21jR)9B z p_po۝3cUIfͨ,E9BXe%JZp@ApLd[6=&mzd%ߝB#fDR֊#+Wxx_Pb(Mvx}t54$靶7#};?RZEoMDdS僞o^e fjM"p6Y`ږ("[#m b5.Q;O=VbMB6ճvq|Rk թg}ҕ.9!aG6y  IVEW퉀آ9HFTI6Xf- #ֿ-lw[2yBP$TkfBBbNREc&.L!5]r>*tJlIl'^Tar=L?~ta~s)ޯt.U(*iiTȓ:3t ꤌ)"*Ł~L8?Oзs %! 
A1P8h0 t]t7*E4Rʚp>L ͈w3Tocs:C2%f,ѣGg' NCMֺ٣N9ZE(IW͕g]L/kk7Mēف%1_t8|ppx4[/ nӣ0Z F+ZbuK,oxU3V6sYX^"f;#XGzhq>Г6ǃڕCjuճ*pY'u'u6lX|>>M_3r_66ϥ75v iaq*ÓQ`jEn.NA_ ꊪr_ixƧӤIfXf]kQ-K?8MO7iprJ/^?{/ח_?y?}~{^(_{.~C>i}FӦyu}>lhש>yEޮ-ܙ1feguj%L0#]&> '{;&w5Bi2х\XE(b"iT'} q'I-vLRcJ\Sy*_QYya}n]u&u՝s+agIX< ub^;pCuH"-!I_,hk18m(DѺe [+ 1R⸣MCG4Pj ' }%,vV:Kk \uy]sÛGneAXz|kд5A]L taGea%KNUHIk<щLi&o~ws}O8B7or8S~r>l:oBz{>z1kKKA }5'==K_5]@Pcᮠn;.rnwne40 J!;)l,a\ JO#k՛تֿq~'frp;T&SYM[:gxYhAR:OKWIF佮58j ) x-hQS̱$ԄmT<"v?58ʐ߹XjɢCpIhTH0KbddRT ^ܙ$lN'rFD"t1IW/kmTW%r(&K ћN_նn]ǂ©6bNٸ%0`\Ԕ0W|Qf'(;";)!K7~ۺ'^$Z0/#f!Ki!җrheD*F+uºVSPmTZ+ "ou֖rъ& iOƥ737LxQM(ELXF8mDқDP6嬑32Wvψk"(9)(AːI޺s{`6tMBQ򰥦ϭxʦv^QyZ^|N_b7Vu1\+Ȓf%X*'#(Z V&R R@[]#.&-˗܂ovƠ'2,+Qvotդz7y49^HymG;죈SunՌb2FHE @:;stOHm{rԶef{䊾wx9hĜA@rf ?{OƱ_&x9A6k}ʌ)a[9HҐCi$SPD4{C-$ ȢIv#R1G TWIXqGkZ ?D hZG :"H>m(CM]Sh٣k^3^vEW"Q.U͛Gb~"1]ITż4 1ѵz yut8_LwČ"rq0 O9rS|0O}RO֫G~(C%ic(S?& JXKZ,%tYJX@VqS0bQG"ʂKM 0+C `Gc3pZn̬qt6p(prMgw7],6a0nۡk]:M9f܍%הK/ F" i C(j$"uTa0p*o6jyvoA1gRcv)~tQ؞n}Ľ EC>5I=u5b_W0:{z|Ϥ1{$,&9+jm?1zɓE7Fd5WY ̓.cgC{&lp{mW#hEm.>Q.F{QlR:]渥mOBTmɻ7Bp 9/ ,P S$-/RGg{TgtC{τ`R{NH)$R”EYIS2&<ﴋC\n.?Xڝ2z.qa27tUέChqxwXRH= +"5.X ,œ2`{ !rE$D/J# :<4cW(Ur n1gv;v 9kv%&޴O.tVߟ QrWVx+x¹Q@ a䜲ykQ x<([ǣZc{Ӎo|/zEO/y>?Rf#B11)8H<;#U ArЎH%O''ƍ_luY#7'O6 aEN YI B )M8ăC ))4ç#!a #bY Rv?uO{ ;җ≁A7Ug.yIT%ASCA jc^{qS[Hrww7nهj~q( >` (Єd 9ťpNS#tBY,2=UUudl< mꇊSxkF-WZǫY٧8C(d?}[\se|~R &as:]PEd}s?|>D-wjX,r FON^e]!5KNW8[H*PͨBkT\Λk#z ,L4y}WYѭ)KS K{nT|`܂}/2Nyg-ʒ;r4 "ש*&JR}Woe_NKGϛ0 ߧ_@ r9yQe-&3n]n$'9t%Da x9^ Gkڅ۟2QKݟ@hUs _ZTLosWMڈiM \_/GV 嘚o("1SQ1Xl3v3v-ܷ9k|cF_ÔccHy`*v3xO:gNU\ IM)n[9<EĖPƽuH ÌМ&俐Gdc|7}pq9U\wblm?5++rA?biKu=د+01뫻U`u\ Ge|('܌|Xԙqfy 08&؂6|m9Lpb$}q-/\+.0'|s.`9/fQ<`]X((u@@1*CZA$5M6;,b+ӧQ8u( S0V=ncR^t ;kdU$H}7+Q Y,gy)?dk:-uC`V)?@P\.@MAͲ2WP+ lΌ6>hw1y/*{>,frd%`3|%gS4a ]  JnuF򩗳?.0BDO6Z^.BAF+dQnU0 HEZ-U`B%a$ 1=fIV  oz?]fgM7rȴ%IZN@g0k7 , `_a ^*,=|VvWEyFɾz9 EqVfr2`(ߚʪES "4i ^˂r; 1T \_?(Lսf"[qR d $jIyƒZG2/Tt򱤣Z:"b6_͒^Tu~1K3+]$OuaW;t0$}] -)Dg_Չ&Xx濂)_fߗtN_imounbxbҶdv1 tK"ਝ -"TlÐ!y^<& AT a} 
!YP#):XӇ!d`-Xӣv2:OF:haX0- /aYAX*X~bAH7?RjqlXVۏR-lQv+V"=k/[,|eFhCH(fAE+n&ՆS⥢Ya96S&aϵlNWނmeXb(E#8'\k=q =x̢D^wwX5m6t01}?][eZaD]B+C0!UXKzИ ,k1hFs+%ZZEPBNy_U-ȹP!#GɲcDd̩M4PxHZ>̀H@͠cF֑aXj? !=@Ci0%(G K^ASm BL A% `-ʁA`I. pq$VZ!hަV0J892jbP5Ho9(l NB Aꕶr{jse] %ޗThEϨ`#/_PH&AEh8PZ 6j"r $C aUhޣƻ },B\FҤe"4(49Gv׊y)4cMDq| ؼ(D0&Dhw_`V95&^.8 usN ^]-^f$>Hp"P # o d9@X8pi76-hJ)h+J2a1'8; :#/3* <@jZ5`A>xmBV5$ U裱8@= %$ AɣJA&ح.(28pGҾ`P& :Qh%wV4K:^A~?ӈoOaij|LJMA⷟t& 1\>iؑ1 d2$>I#H#>{goWsڦwR[ՓCT_}úI O QgƖx*])q\CopvG]VGGo!|Tg@`>m[sT+-_gIMmu~u.߷yv 'n'#Rx}:< {qOIZGHqd?^OK])Rʔ Q*r'T4s?ZLˮ>KY9m hKۙ=2&EtԆ1ݻk-:u̳ɳ3Ogkҵy8\>l¤k?m,z Oxx2#ڥNheudJt AO&<$:]DuYPTkAQ}Ecs rNl.ɇRr%[.kkrcA.( ś(iOkd~:B{V(h﷟yyrbW_h " Q n3l7mgJuS}TV[>*`?4_ '㔜R} VC3xzێΧ+.}IӀ|@|wרF1/xbHVQaQRȢM-J%}u6*/cysKR):BI:95Pb8pX8wÙ%gCw5sm3PD-SPstTAO]s>7ϾPo\)R&.t|**-WR NvТrL`:^:D[BGs@moNgZ W8)}ےgɓkUYyiaa%`}FۡFWɳԑ,A{w|ͳ/R/"qwfW)'EmMڔKFBg7Pw + zPdqL@d8 aP,t2>)"g;;'f6o+S=Em?=63}oTW)݋X)r]UdoQ0Q3\dd:JV(d0Ci\X<6Q"@Qx!-]B) a܍?F\qX~<`D 2d{>s4"M%Y{烯@Y9Vӏ'j " nD|3pq=ms%E)bd\d\ܗ {2 *ɻBw.{˧3)eME9(Z G%SsaLP>'i^8F9"OnxҦgqKsVܱbEpafżxgx@1ʹQAzpigZ{ȕ_i  f&Y|ȇlnd;aNQtȒcN2߷zVV$2[Rlv=N??ww( 0w|^~{y.֜0i]u%c=cuT,d;hc!*W7NYwF/W)5]4:`H*9zoYLЭɗd>v:`2(QB#qvʗC톰4q5Jʣ*3P膃ع3)..֢TeĒD$H[wS1( b`6,Xb%j&qin{=f4xg0m Jh9/҆ʂ I^,)^HdbPE+lJ 96g=;kYW}j &A_B֪XcK~+C S!%HESDvUK\a;Sο5?yeBakI,TrczFf Z)b2b65~E!d`zZtMwq.9ث P$,XOAY )\WYP#ȴ3 oˢdNf>=.w/|;^+#FVXxx_T PLɑl+Q((Y3*dKRIȈH#JP|Cn9v%Y~hlNYR{CIڡršApAU +Y0Uފ}>vł{H#/n>xt4!R8щ"1bR1%"a 5Gkf;|)״^T4ebAQ"E Q ս #=P#9_ES NJ$rEkej$Bp:SaLgKW$VƬSsҥCjq~4N4ij0OS ʩW٨C:m|Tؒ4?Y'Tar&͟ӟ>M5̏t9Ie_5E-rW yR>ZOK`&eBPbYja28?9e#xOCΎ4KV-o%CbqI8 좻# _W)QWP֊18dMhFPq lxdͮYGN.* ϣV~Z,nr~efN/nޝxS7`8y}:Usfg7kzviߧGw}28[Eg‡ oc bl/%HXFv iX %+VV5V7c`e3+bѸQ*M}ٌ<^\رtxv(m0INH ) [± *ģWѴ(`Ҥ%Y++nU˒-FMa%{=g'Ox-~O_x?SJWv޻ ߫Nޣ4>iW޼i@NӾϻ "y+vۋrgB1݀iguj)LuaxctM|V7Z}M-E6qeC}9LY`QMd޺YB`0YnN]**!eXJhOd)$x_z {5?73'QS0yӴShS7WӎadҴOh|<yH->L٫j Y_h4~ܻvf5{ګ=k^?x8H/9M;ۡ:Ks1\u@?XҧNj/aikԃ:/ aGea%KNUL5c!I'~Uy ĤN&w6~\?ROPyY N?' 
GQ{xXΆMMHSC]OVG|);aB_jIgygsWM*6TM~Ee &!dG3 2Ki`cp\ JFy6aI3Lͺ/N[GKQ^ dՄ{J>a?)Bn^gV ~l&C9;{sov0oJ{ISo(^B=j~g~7Ƭ~nyxjwYͱV[Ij[sy0cNn2̻] t|T ^ ޸x*`(}d\މQbPHYge8Zց ͙$CTjAlFNW*Anʼnmbe\SZ*'r eEXA%)aB>%FXX Ž)NK`u:a`$p_#*WFHYBWB(^QdBӗj[[]ǂ!⭍.l001`5%+>).'NǶHNhJ҅߶}IZLHHD!=nf"q289Xu T4-Pm-C- įeQdqתHL^2c nfDɪfog$A" 2Ci/;=V ((XRF@:qkZ@dX^UALDd]u5H6B BH%:(X,`1r,h{V޺ US6Ct$Y(zԘX?pz,K1*q,P-1.`37u#WYؽdPJK>Im=쀝^ n+CJ@[N۲V7ٶH"fy3%uya4ZoaDqmu]AeIO-ɅIT+3,!R*ʂbj( } 1n,b[;c*[/_AW0/a_W<t놵9k=D1 QV[QL69s . 4 lUlUu rPJ#rƉD-t \fPh#ђ&]c'+Fc0޳q$Wr #!@8ln 8\Ngkdɐm=3cH mf[2SU]]U]6T[VI2+cZ) b?D hѵ8uD }0"p*][>w;Qs>?5deȸי_|[}b~"K/smk8b*kp9A>EMB\H̸-ұHƹ)GNzk~';~_% o [ dP*͘.XJ@$#q'aR/1"/X((VHAԔQdFQ4`pHHw~vN 5.o?NѸ>  ]ZBwv>77嘆cjWp \S{XzIXmAP!-axa_PG%0 `cZֺۘz_XeAc,6b4wF]؀mQEW/2]W ;xp8-VΞ=e[l¦HhY~"˒{rS$cb oBth0 ïrvݥAGpR{n(r"n1s$Cw`=JsڕIv{<--ٟ讬^qra  ;S0o-*32$&[ ,O7g8GѳR%XlPLb-cJ"b5tgd#H/l#R%O''#_5/eKG6 aE5&Jd &1@ )4> ahx cw.>M6]bŞmt8DehP[)E][v=:ħ=K]񸁫AUg.yIX%ASCA BJI  X+ w4T-+MsV/O9 *rK(UCf{8&XxDW_*bM Ze,M smFOeCL t8:{Ys6$[AO٥'@P¿k+2yh2GE,+] ʆlֻ<@IQf]ԟgƾz .::Di 8QbsŵRO@76nڰ}z@۳ef?<~lCЀG.Kb4{>_4q0:]-exzZCZܚ577ę~12ҬރOLǛ(rj]A8Ĩ?ίA#Uy"/wq\&Gʂ7H㬠Dzp|*LL&E_ٹkeI`9 YXISHb}2/m!aD0W77=]\ddP"arsƭˍ$'"΢(/^<{Wa#x(+%|ρL=z*wWRT^6bA'uVhCc l @dV|@weTX5q8}e⎑xYâd՛Ʈ\>)~,Uq_ zeL^0*rX2W?fЛN SDL@|g*!莳gikD{禎RV,P,Ye`oST MܩhmLzy[GNÕwr jV.39^0U&a|4"1Cl|Կ rz ǸίqbC6C z Q҂0`Hfk0I6yK1ST*( 6V=1yQ.W)+0Փu fw7\Zo2PJQQ4[ 3Pckpj:gzݳ%Ǡ> $g\Kr1$RY1g^5F]g*z=z.b=361/ R]΅,\EUEA SZ{bdHz@j6D+7hQb}V\ 9HoM[Ѩx-W3osqάQ 7ItEW/?oB JrnI/8Z! 
o+JB(kB.0n%5(X*T.V%b8q_g"(3V`?n"jqJm A)8k#Mo׿nP  iKq*ѼQ L,γ?%ʝ|(i * g9P՟T{JG)&(O#EoXiU9$>ͮ/F< -\oE5|}}[;`φGا6A+R3_g=gƳ}7{Nrq/|MH9i)񴜾Y7f"+T!:k夓GԾ#l)qqk3j)YO4o bZMk|o񭈬x0>;\d--qRY6Wde5ۼ'Jw/h~;#Qw,8†͋:~k%HTߞJmL˩B5;{s6+!.ov t4f8ΜN]fg.$_wi5«fq"rıfo >#.f1a롵]* gff*6+h<H2D;{{ f.|q1h q/B;/Ey;E]o5+|+S3:?M҈&m6StI$.$ 6IB&}Qojg=ZZ,0FU f-v€ua` ɣגp}>[!C|,WKx1}Ly47C[qDClò`G5`/6Y*c_UrONNfB:y|d[X>Rċ/%$ 1;U~ْ!eF[!&9,vmɖ$!Bb G1*[mP-ONRtnjՒ7UMzUt: 'b H$ b |IV Ƌ3%'oX.c[eos7 /wxߣ[­O1g`=)R&HٲF2 /%.cF4bW,FA !RcvK[,,rv°BjYev['8FL/ܭc $٥wl{~Vi/=r[ݤ?Š ҏcV~ Sbxjuq '(!ڋX8Gb|{{Yof@q}àjàԤqN*FhR}e4MYZyYIUX>GX_vL* W [ [~Rфߗ<['u\xj|rg+>3RϏȥqLըIcUGpr(zb4ʾ=4dՁLcFͼ9V;k@nlf)q6:6B C }oęq튷oV̯r/VӅ&rgKNO*[h]qmT> &3okJ-UU %I-o-h@y1R~{z,+a2ݸǖk\|„{F}r>>2ix{~Z>eb* y3tD^^%s->Z]dOb$ 7gM#HGŵ`Z=zP Izֵ)Euľ7:%!e;FYkNх+PQ ~XȷI9k3ZKobKB%d5#jEK4 ì6вjnXē.ZT%i: bƦ{6LZ!rKȘ5E*6kt,:ZnPB h{lҪ6Dˊwh"AZe;gl.dL.Vdf2Ci@>1(h-i氡m,+E A*7_6_dp.C^ [k#9P3f0Y{\Lo*%ǁ-' C8;fU>(-հ^{C *3=n@crg(Vj*2cA"9*5!䄌˾ e70&:$>{ < چFzD~ӮB&1ih2" 1*yi>|J_` X,fW#$z* *3d :I.&yc}D5-t_pREb֠5kQa7 ."h<0{K*5᫦@&]MP$nWB8FKw..d!U(FNp"bg"ܖM6"I|5>}s:+=qP'U1UҩVME%V X;KK~9hPH9B/sM rv@R@"ihʠv0a^J qK9AhWKEu  0]CzxJ) ;g-efcjlثyB2΄`$(jn㼱6hiiEj,@@2 qB&#rwäEeXJUwU wЬgթPBЌ :PkF>_iX=VHY 'MSTDH ̨ ᭻GkTZs]q)Fޫ]D}XHPmw)5l@l}AoQ%\aj\ F꼱`Cxn| is2kdU9`âLG`>b;4F/mh0W`]˿h ]U2YY{ư2ZkIYNy(YDh4v+ ƴ_ӂφHzQ{14pл!(C^rVrmȈPn.?h]4(Q娰)TP=ơ :CJ{$=|qP=@z;z# +E$hX[}2H8C|'"o/oV1^8R cF[D12F3~ŀ\\S6qlEigmlcUy6@zXV=Z^)i G]@JdC!BD=?jeנRoݤlƒ?샩e\x{$2awU|di#bU1/``aegZ _v4zȕŌBjwBS"qVd=Kdhi0j 9 8؆Rcf[15DEѧâ>[ZdjwҔ0rJUEY 59j‡БT[SZ,tQ5**mZ{]~7Xx;R5wڸlo_^ܺ^oŠ;r&dT=uHD3476oМ߇Qͯ OvNuIAUw>ۛ78~u%)6./A(cG\;:gD'Wۓg9l>Gzݞo?&>S3Dq חڌ#mO7/N:>zU>n&>?Ź7=zUNOps]mv{߷rN0MNbCH ­}$ܥH I>{h\B}$P $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$WLAH !`nO>WVz@_ 3m!HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !jIY?%hՓ!A=hXNʤ@_! 
4HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HHmR~J$*O2coۓ!`= +o@_ d?HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !!Nwh~=j:zooo8obn}G\mD܍ۓaU;hSV_|=|fuTrjώlGߌ~M^T;)/Vp<~^26m֗ē2QDW)_j|uJ^<¿ts|Vy̷GN_]G~g ,H}[a3srn jM5QE%�˾EO]x~A[u|%.XW>^$1̱N?]\j͟?۵oޡז}}߯pd4ᶼE~E7pاf$8]'oߢ-ޭ8ϟPt6*̐ n;-k8}z?.'`;mܺ1~bsSy#Wk 96ryn[UAܽf?ݯ^V(7/鿷wxjlwW3ݷlCپ<rapjo0&ȷ[uo*fy|ۏ6s~U.]|x8ЯCFU@ ]dj16XTk3$5I.ˋ`(4.ک*F9VM8w Gek/6R}8[bq9]\ _x@51B^<U5"b^3INR% W_'ζ &6` 6nl츧H)hUscc J+ZZI-ڞ~NOjze 3m3ǹ99wO:YՕWOKe2!NF 6(VwKԛnB00$IX$qlx{yȴ񾼱勳U_z8rw8=9K2wsOy&>4KVD9 e:W1r6ҸV!\wwDquhYh[nE]^0y5a7K-"|f3{@xlMp8BR0 i۫ d#5:VtU'$< h}s$+W=÷8$E$#eNzLFI:mNosG$/&s$=0؅NT>0htW]ͽKs5gnԡ?n}g1vgդ8P8q29fr#(,"ި6bq,(,PS\)RRq0.ZzEXef)[DRC:Qi]F1Xɳ+^JfBN 1%"^8 Q-c*U`C}d15R!:uV< ʏņ{zY^JsL ~y"$^IS8_Iu_YV* DNYl^|`id8f?hڇpG""!@qdSU(IyE0@BeCl~|tAF7 2Vn >Rj8@c $O@;\:ݐ}niHk:3;t}ib )Z|2?]DN8X>FE.qd{emj=)l.EVA[䑲:yO =?wHv(KIAHFD0Sj-3K۽ip^BWg@0JhC(XRV.*l&*Xf 1_XW}ミ3)dS$U}1״$:=m/r)#qwr{!go'I60|P9Le+sɼ,T'o/'b8 b.͗N/v;RX% lޯqQRՅz/B# T=GTi0˛=0~qF ULzso}$Fk*(0xSq!_1K6n,ͲS8 qҳɗ=SgR',)Sd6=ؕ_`23ܴMkJ-SSYN+g3&9_ˏ?b>W߿y /88WJĽIxw/$կw547*кYO|qs>bݥ/0 PuYDLI< HhNz{@="$+R-wVZe|Q( # XNiFIrJT'=avap䮯)4u;[D=`%zg"ͽN-:GZ۞F\_t&v6ui\wTJs,M#S| 1asgO=\_fO\G):TY_}]w٤WHv8ظ:>g*f#U:y6~>u@:Jmjw4YM(cur?\,g?⌷!K~0Pv}z%y l۰)]U|7}{&Ѿɬ}m xx KWt!/кڠ xZ\0fP( ^z>wȻA:W(]QUp7oV6aU i=ҢZcU;Q20uWD|:%c>gyX "Νڛl C8Sď}Њ=A&220EM,D $ Tz1'2p%2{H& QzZ֋~f_@+'`8JA &86b*ŅvA~`R[b̦B6Yu׶ft@gPr\_dyoG,36`gA8^dQ.zBB1J9Q6r M#5MĢZ1Xr!$* "ZKȎنYY@ÓT88pAq)EFZ^& pLc-a'*,/m_g%sNOwreSlP!rFBB,[JL.0"~ &4ْA C!˶y]QLJn#sMQ`%,H1H >(.h4^&,YydbOighe 1Y,-$;Qm솄p&ǔh ﮩ6J OsQgHjW@ἈƗ6|?b`9KA>Cʱ(KU9GQ4H`"ʝ54[ N)7o&;l*aPSԵ+(v%y|ۨO㷡WJԮGf^֮ޔlɇ_.Ha˃ti{8Hsˍ!QL`Fybw1-A+@9~-/9kS2cH L1@ AVxd}&jc;L>kRNq,zaD"O3ø)aRbmsrCXe]XZo|}Jr!]|u.FpN&Q yۍUH;1 ^}"%* >J*ܵ9ȯ~M&l0g9[gYHD3la*tlo"6J2ElS>0 m:RH𿋭%+íZкjwtfu5gt=m+=//[C*R Ŝ:(i5{];-9龑m[66^7^Ky:]4F7"\ BYysW8;( , sfˍf(W;Mj4uV]bΎ" Mi 3Y@8:OV-i LX *ǚ1ꄔ I"%LQD-4;`Q76Wrܛf(KCHFY}STLQh!Rol0JKzabM J>4`\`noqimt%%M{8zX"S hC9yaZr 7)Z'QhL:]'a+Pdy&ƻ Ji57Qa&vEa"௔T+*ZǘdU"a!nFꥼ2+ra]qz&.Z[!'II ^XNcP1Q:d:3:=y 
RX렔&oY)#zÂdMH#O)tR0rp?2rԦ+)==|T36ͬ)]g\4rOd]])ZpMmyR%4o/KwGF0vpnU/L ӌqzH5Gk$~|~yz[mm4? wOu| >i`QޟOFc7 ͊1{탷88.S魋Qs$O?L/L)FGԞ?w|Aљ7h|[Yx{+xzbtvp<^4tiB~?^8âAjUr*MůSs$t~(HJ3DZ9A!0[ѭ[ZszįyIZz'};:)Ҳ+"+Nu~M~n,=yb-UN<'CgaTj\$fK1+Fi.x|ߎrn7'Ş;gyY~Kv_qJ/3?~j4:ǜwheȁWlq~Wgox2rVt̘-7af.;Np3[~ӹX,م[n,g֊Z#lq3&S4S1N}?i4N݅҈k'ЦkC˙SOsḻ+ O=VGu,GP3g&%0ZXp6J}*볌9yHo(8'3Ҁ0l%k^b9@ZZKT1TVj͹[QY≖xS_^/σLܪ|lth8)qU2IX`fbiBǜmB)3r{c!ǘC`V{l̒R% KqZ}6g!EIi@62VkXTj)X{,ܺsr^5?q2 F/-"w֕}B%Kҝdyji(%c57r Aԭ*)̊{> &CRؔШ`J𠋐}wY(3SfnD܍a4K&殠v52jC=|1@WD0s >:"gΒ}Ƣr Y4L9dpaẋY,db\2aML (q$|zQ͕ZsΨ;`D&"GoM3j-u4H,h)#zHy] f@me\FP" d>bQ2Y\NAz E@U5nD} \CvilUEb没 =U"+ L6BH1JGHC[Np6pqWtrn*V}ϪmNp#y? {"||24~ZP.BBlVpRq&h!R/) 9V@a^r;#9GHCv'!UÌ.`s`Ġ5SRYlPp+Q;XRm,.y=_V9dKH3؇33 "Ĕفħ<0YsҦ89 !`>sAS5K9cu@LKUU. n[zlW Sۊܛ|>ٿONk$,ȑHB*⻭eDy:⼽%KoiΙp7Rsxu|lfu]ν%57GjUCmIL>r#z+lĎ{qY]+LAt?'P cdN't 琢VM 5f$y}cjޅ�+wmYÒ^ [&ESRuV)ېoQZY@N8ͦ)C`c=X 6,eQ:h-;͙(]B*mtF,gYuKk9GBĸYJb fkӁCLjjekRwVX1->0S \5g /N#csr##ŴLVQK|;@+y. ]W 8-)4WM}VͯpyiMٗ-_|5bP|#@+t֩TD =JE1 ҜNrfAJ(7G5sR%RʭBڠ#K@lϩZ)XO]U}|2 +4Frĩ򎗬 QD:3*hchcϳɳVJ3`\WVI]ߧCffݶ֘{W53 ; Q3F/O&%ziwahFTq->evGi/x>{|2Ԋ4+wqz:bf7ϳů N-]zQG;o|0;Ssfp5jҝ~Z˲dkX?+ YlZ)eh`vsD<*&u/5DvEq}Ś+(FGetE43SO$Nvм;3MM_8{;0?BԒcZѶJ pX1/g{ttH֎L徟~W}e[X/%J7$ɑN̓Aa4dZ |QG0(nn{u )_!۳^%ϤFE'WϪӯ~z7/~x8_wާǜ1w`ǁd!-Z]{7nk^}A08-5ƾ t"mY%TXF` ~ .݊"TJڊ ۲z9֘djF>XD K-,tdȜJ ƣ0RV]A+m0d4@) }r06I\H68Ɯ R) AQ%WPUvt֜5*s#IܸOqps, =Ɩ攻]Wk:}C󈫁PWRu C7UAP&ō֪06# )cČٚ$V$Xn"kn5m= l?4LA:2M Aq畓2kOTVGgKRGFBHT2!p&ǤAY[l)5&$ɼZsԳNxj6_EPCZEtr>9\u1"@?:30#\$\x$ W* GeT562Q\dG5 B~*u8'ZըN$vhGF^ٖģ" LIO0Ұ%1ǒaE"=pTqJj2Q,^3~w3VD0*If^h^v",)T9"_Flt?{ƑpdGG.&6GCM)p}IȢ1[4{U]hF+,"D^nmܗD'f̷{xz *!8 $I#P\ =V+b{ 0VSv}C%A/`5&ޗY\n?oDq:;$t.\'JqwZbbqtkK($HWC_{-itYyPZq=; \2g>`rr&"pee6Dc J@jZ6B  C 2X!xm :(II2$8RE[#g iO&Vb/6:d(*[):Ge,g< 6e,)?\q Ko,3w4^EnlZY(H94y,:MHK!% 1j wa > EqQo&ŬOpvǕOtH&9+~%ݒ|H 6 gmNh/6ZE/`lAu4HcLHtp F<O2fe:k,9eEBJ#H*C1&A@+ p%0 3a1^4J,ťwbJhs3X >uai(;$L2Kx.z451#AIpP,pR ?6q$I||_f_*qϿ65̟46&P![ i֑'W7ϸ!  
q9;|gxgu8<;ľgZR Ldzra!7]т,#IE8Lbu> ,2ȐxUFVV ^MWւKY aܩ 9Ԕ``1~v2N4W+s͋y~mT?O^cFվZ;#mFlFɍ,oĈ FF U,'pп/zYomrZoU'iMs빎KzHXp;_I~X!+k;Url8MRW?] 7R,!]WJTF47⬋uҵPٖjک3bo0}\)1/{ݛ]''~ {ߜ[z^&K$07?}Ъi.C6g=[Ƶ)oo7ǵN_xb Cllr!~)nxnՑLYOG דVLa<`)Lj:3J"c2IhJe6Rӓ>a.\0dntsW0N0Dۡ [ܩ2 oS.L&%t ݽN+:{ڞB\_stnL8쬁UӸ(F RYMp?uɾd߫igyuU߻ްj+7~ła!'BZZ;IBuxܦQsjנ ^`4>ln;v\pmZj_ܦIˋp&c{%J>lx`1MRo1G(鈬YvgiDБ- YN2K,h%U-Z#9eQpBYVk=YDycd\TEԘXYx{ R[kfl%Ϫ.v :zXӒ,.|_UhR8,>MTiX/?4jǃnu]uA:8` ~/^-xu=z˥i3N?nx?ktGeNP՗ ׃rpȘe.>+ 1Y,rpHd?WPrzS3 78E'5m$}}5~^Vm}?NsOD/~YVGAnDs.EMO]`UktvB?{W (XvGe7-;u170IP llP>Dltfbuwk ̸ Q#.6;zQrC@қ#+?E~T;RRO/&ͮ)۽Z70 57NJW#sPtWg3װڣFJZ:Q*[Q/B,ߤc4,MTQ0')0\Q詨WRg7V6']Gk  pIz(\XPVDaI1؄ Weg2  J;g|>'} /EeǹHn,FGt3lKeUme >jB!mu.C%1bWkiCrm4cn$VS%FJI%ȈAzi=Bvp-:<#U˓;)Q΃\hAT_jATxs :r2N_ZH),)dmTѤnAZ#Sd:`r1Mk܍iH&D| ?]\UDT%px9hpqvFi/q, lܡ6("q,Y~JJ.Y%33O,dgGڥ\pVh)31o+;Co_?{_nrQ-E9bWNyF1 ζߤy!\/{oAq.j@ZL1:Gw<|>Q}rW1?X[qSܤa^0_-K4@UDRl;V1T )y$J*CB:T,6fц6Fl1"IzEB Z GnyTώFi7̋{]j|"ɵS'9񔎘ҵ89a³᠞j#'!φu`$t<,$-KUOkuNA0}*M).:X`B  ! V Vd %H8 BgL mb:2K32lF|4 trz3"nOlMvs]ZB+T͒swb3^2('IF@V! zAcHHgIyb݁t`9:-9F 8=J(*z]CQ،!TԒI92v֬ȹ/G$or>\<{Gخ=n)r^iOY +ŵ>fyD@: d!FC PG5R}:B'.{tݓ:Ѕhub. J{ .BcM=@V"xp.E&Pn8n8>n@ԨFR^Q":R:DAÂWD`(,BʖbAn0v Um-a]Pwt=U.hO=pg zIJ2umD5(AVWD_R]u4)Eu鵔 2욷kj]Sfu=m{ZNm0n]ͦùa[wv+=/[Lrǿ;seϓ|-ѵ~gW7?묷dXZlMwݚ~:O;1Gƒ>5Pk#O\ ;(x׆+?]o-z~?8WʜM8h4GH\A3Q"«8 RFQ;D8"dY^1[hU>d,Td+&P-Ģg+!D:'v؊b:;1C&<;k+/xrOxEtR)rTHx9@@d2踳23߅z6BO<_q\DAiJ'œ$'[E !OtaR1rp>2r6eMɱacrޫ.ݒA{˩pRZRq*T;sd7^)RT6Ie&H]Hu4*QǸ ָ yVH Xuv1opT{gdIVڳ|v>dihCKo>LR2iwDI`/ILWjc*rlnËhh'hkH"T-p5qv:%n6wE[=B>O2ГQ(FLϬ6 &mQr!7{8]Jy{ylZg#OELFD0L)aʈimr1+8%v$y|txivwv*0R\YU NxH`|-e h4BXk!]ԶvJHeN4ݸ&뉕4}^tM{n3݌g0iS5I MDtoIJe *"ur)DBDB-ME ?ygLPD|ťB: 6*U߱l;_ Ar6&ī[7i*o p}hn? v7ҡnm\;vHQǎG(WIXTSqPB/t2,AL("'6 E Cd]$n{,Qh7XZhSؐ< 7p"jz}4i=.t7їrKF8Nnp~bi۱y3z?_6X&&g_}ſ ]_ 1{;Pwpeںv(Q*xtQ7Kdxj+&/ꗰc:Rjቩt)d S}j`=]vUIOg٨H+AH/ȩFL!NJx$7FOYY%+`\w6I"o]ڟMR -mwT$ß-&,m:G1p$ߕ0i7fZZ؜:V? 
(:Pd}@$3x"PI:QZtނ!2T(Dže*Ң\J8scc IJ2LT%POs :}bcTCXu>P\Gc:Pd(nlֆ/V}!'Hg/y$\ ϐEp``-hd 4:ikmq7^z5_3yU=WOJ"xƬHt!P!舋2 eiQ8YB6F%D;MM ]9U%U&g , -L UgG<'SD[ %LGbwʤ9~4 2O=N`\rQr28D&sDi⑖"v~xXmok4~lL4nE`ۮLΛn0;C7Ojb͖![ ,;( 1Ris@9AQE%$a^{\}J\i EQiH!$H>WZ)C>GcƠSbPgR؄ UUnd,Uaa1 Ua,<'U4Ef67e͏}p8>L#6 hDH!P\Em%jQ'߼4Lp̔=< 솬wqa+J8st&1a0*]4TuFl7XPtPCnxjEPA`X 3Q?#AIPDϨRV2(&z%!#3h# քHb;)$QxrfT#ֹ0x(xǾ #Cĭ7Ji) (.iaC4eՖ@hJ@C@x8*iY> @nԤ|t@qơF(O9p Tl_:j܍M:GgLXǴYLJEptԸG.YQKNJ9!9{Iz8BjQ!pXt =Hئl;w1N߲Mn#fy\.a0U^7p5ގYINWq5F^Qwo[ J o[t֦ '$ng2UZ_./ƦYBz+nw,˭z^jl7HnoqFҗ;pyj'~g~1Z _X)?/ڔUBZk\<=bzUtNi/9~͏/;+G,X0-LĨ9[j8m^E 2a|V2@[檣Gd2W$98b#Hu`*;9L\&KEUAhf)X3 -V~ȩۯIı #'堍!iReL8D*̸E2^,.XW0YLmPsn[͔)|GݕpC A;}uo*$N whϲRy;Sbjv6 9Q| !3fjuL(C8]e:%' ]dWq޴ l#r;rcPnlX})%&@d*kTC|T>(c&t< 19^ 8) 8. +#Ū !bh(D&S\_ k!Wtk\S> ]{WkH^}W:w8fxw8pRAˋՃvr .SGMGKʙ^hH8%J,(T ft7sz=wW3yv˟W@t48L^ 5VΓHe9XORyfIٻ6%W w~T{Y&~}`G5E*dYC/RE[l {zNUv BD&ZV݄WHu[6)bP ,HH 嬷l(g\¿c!KeS|HL "E&M}Y*RWxĝ)o?WG! 醀Y:ˍ%U6hɐT&,JHjpI߶@g'}5RL2kRʉ+Ȁ,RɕmBE Nq8tv~-Ľd> oIY#+,<.{Ub QMKc6MQ+%!Iz'mEDoj+9TmWT·нx~4)M"p6Yţ*H1-QD FHPAfk0Q;O<`l6C+гtS OaMa>͍OiSEx-K2: QyI BŽytyN6Yp*O&D@hkC,hb{(aGeMJFܒA$H *(Ҹs1Q{ؘ2H J i蓎FC6%CX3d%U侫֝ SZtB.d*P((Yj9tm%Y .[_ȤE\dDv~X+Pbn\i7b/e6',)$P9P zMKa'*,*NlE>V F4K^%pm'Mla@DU1)05I3 _ &f2M(MESV)6dp%Y̐ m R70ғ e<N{ՇK"WVVM"Dˎ{Pp\@ 3^zaXJyMYA9u؜jz_Y3JP &i)O M= 4$Y=L?f?v;԰8\4]~=nU@ÆQMkgZ} ԠPRHjT9 @m?LtgOi_U Ȑc\iuwn~2H:heZIiqȔw9xC)(>__DϞ\UGj:~,nrqUfN/omB8Ł'/,6i~v3;muvg^\g+:cIx7x6/Jna!k!ִdYɺff mdyɂi4t`jǣ@O/ 9k׶ݭ.nr]Nkr1-d6lx8tΧJS#r_6YʋK=AZj/f\_,t%+ )xJc4 8 M_bv[iWޢi@IӾϻ"ykfSN'S!~@`2fWm%LuaxtM|OONȜrqŢ*,""\6[mhLSk) 岅'ÖS_F46Tk;O)(t*o-2A@6St S,v:l9P_ﳺtnt%쬀U;PwP'fP x鲾 :]O]h :xz]m-' c ݮ f Zȏ\ܟt34(bpvܜ:+6ϚWNku7gg6P+]62R&cF/< dHN+~n 9Tfot Bĸ}5!BK͇꘮kOat|nFW [LoA _g_7h }_V{ڭͱE] >8^ȫ p>K_W칭W+ mz$e~֢{MJ'Z,ZhbP<#u+;t?Mgl m}yi 咂؈!Ug# bn6ߤ6aL E:nru),*t%PL.Doz:}^[5J=u 6Jb-D!FMOlB"jga=- 0@lفVhllJOm- ~ u<% օ6@ZgeĐcp5#J5Sӿ) 41R&00 ]hRx[)H(" `b J@6Jn­Mh]bLb""˪ERBR.5}@!)"e0g̈$cA۳֝]%- Mb+na=jL,IUa?bMؤN-1.hX庑,l 2L(%\$C̶bavh/KەU{.x[g@[Nےٶ\ͯ$@w3Ը]]`qsF?+ O۷= 
$cb)@AJ"o.)н"i蓂s|gK1oq9ȔW.䂚FcۤZʄB mQ:3'^:]#5XU>;Wm@YT("J%w0'P%h֝c։6_Nw5FoWcU KyהQgyF^}'lJS/Wbl뚥AeIO-ɅIT+3,!ijT*) & Nx{9zd}XޫS>)?| ^K`^ow#7J0ݿɣA@jnXo/B!jً<!&|y1Agl3۶}0۳}+Aq"(jV rd8>ZR@䢋czň`T&$ec%S:jmIF1Y'{·1"k3U-P[s>?; o}'{] j;HnIHlbN2; OƣLM5 eqRFk)e@)]M§ ?~n=hKJJ41Fʓ#.&t DD'{JZ rM$F#-3VHb6 3;&1t2~+lN'Nܾf[>m3Ҫvp~v<;Jsۣ1i2SXh4?ȨHxbvKwd3$}J/c𖼔 & D`RX^I;$Ww>,+qW Ǔ#.}G}Ƨ<;gH y7,riSIJHrSmLQI˸4KR'-pٗ':iDŽ6}YZJt54!ftqQc6|?ө_]YA65OjI[bTaw$l:Pl`R;a%iA B+{5$CQ `/U{6'>^пdu{t=唇w;9ٟ'Wl@Xk[2(VJV-$w9/-Y3=ubPR).Q(TVuFPZ p|/?{۸ v{䲷dqI3)(3wje[Z<5dY쮮Uu= !>q[X L"^ˈi$EͭI %2@=64dʌ' C<0D":Z)uD&;@Q#7'rvś deo]!Ts+7|.~7dNraU\e=u_ބn nͩ<]{|23)a~)sHK-mT^E&IY~_ycm/:\,ŽrIPO59,C}0Ż]P5En,]>Dd-Ko [Tk ɴ#cQpQ/%Fת\9IhM?Li-.i=5,^lìjkfOpqLX ll*O-l'͸dxp7?ճp&YrRd8JZ=])'&5ۻN@xb3|[BZ7Zۻ.mͦNoun iq|5b+?uwܿ><yLAb}ǿ":d/Zf]}$mT߷yVwxthV0e,mQyvKC̺bnlMQVFKf+fT^)vvMm6,uq7єpU}z띥ْm wDy_3ZMX3FRz!I)#41H1"%BLxe!pm}wO1?DѭPg'v>0lnSVW/}mz%`|aXRX=A,%œ"^L0}/N"/l|NAk \։@Njc\!잜W!n1'9˒䬲`\n6)$n5Z춧o(+҄3K5H,UrIO#ze6K`?{[UI)f q1<$yqbrs.:&n`C3DzF>0'yT,NF T3pI^0]4)m)WߏC>_*zf0^|lzO[\JōjX(̊<\aM4?{ZYRz'XɂaV:%fDC*1q]pk! p.Hd/޼)~AY)ᇗ[WIID>N Q#~fU-kOf|eHlռo-?|*fIQyH 3xH{6]\yv^h<1B`"7<Zgx7osָ8Ƃ+tڄN)эS_.eE55@1eȁ-N8m7WY}bo͉IhU'[g9v 5/k"mxb.(eGTٲ3h)jz&mTRGӞF];y 9+DDIo:6Z4m9u}'q@'i ~Rn'bXZ,B(ͥ *j?d'I}y>I28|3*AqQ!jSPZP$s^8AyVASIq˪W}HQJQOI}, b=QG37ΔǛry1K[@ɔp120 ,҇a& °CDKdw֦=#10Q.\tYMiw:}̛$[GCRpreqcO*h%{gqoR0#4'(xAcf}\|F=>0vAy|v'"#6sK?Nb2(h;2w?0w;«J\cr22K1WɄU9JJaD´pYu@VUgdYYMB REZIE+StSu#|Yy@Ҩ%\Y&OG;Sc[*ID> ,[@u~-W u~*8\e/2jb w4[9x$y|tzejF]cURa]& ||L&9'Ԯ|04dRJ/mPqxNd4&b+Xi&ˢaGTǿ}Px !G}ٌ0/,OfsR.@ltVv0GٷƮIywb/^o 7rk/C56$7H>~'\5w̩N{KZq[Ls6u+./du !X=@h L^/Jg ZGKbh՞k:FY^1Tk%i=:gm/qm>ۼ'=CW2T𬏋³t|p\^;l(oz/$ ӻ 'muXP46M( sZD)Z2li%% ,=;<yn#l>G:Jǵ=F4Ɩ 4rpT")pV#9&r6ȹM:2,"sGéĊR.k%)$1wm'0o;B nnj." 
&L0b<+0!D!e!x0k50{-#౉hj4Y 630FΞQ[E3?fu 0bQcDT@:+(S&#J B fn5`F0$5AZX,`Dg' q؁E" T)P,)5rFr{AK?aeJ7,[9l|ITEc(i֭NvMxrx3a˝𚞚RuF=ZTܬRyw7Iv#&Ka8:D q)GDzS6tc:#J }V(GkQ:JH1\-}Ԗx0DR%c6rvK|X%,BY^Xl~ hW usڵѠ4n4|,,ZtR%{ h -QovZaAv-yO# !){D%$ItYR:"*iD.rvKl;y.R;w*ef-{#ح)DžgE7D53@?Ciiϰg&<\H☁ 3Т d 2 .c*LJu3l얇RߥM\$b68T"Q8YPS!%V3" vJzVsWO @7QzHLrƂkm#DzO;)`ۙm 63t1ݑ%$' eaIQILrRݰHT"%Ͻ7[j ZLh4vDpQ{Q=\ǴYl쉋 pM#>5DD+ILRqorPxŤSs:DҖQ!pX8<{A;Xd8Ǜfܺ,qv򅀳;:cIQ(NѢ8.4voW[u}gL'8>$,ỳjQL96j|?bǹxZX_DGЌ%ǔ.F&-)LpOj厵ZάOW!xW/&+ft֖,Qs}).[Wp:/̣jg "r5j/)n B8M8^ ՟ygp{`L._^?6bEWK7vIB Pv(epB!9״i`T-RZ%VXv#P0vl o'UEC+5T!dݿ\*cT!X.æ"wPa">gSYAyoy Zsh.Hg8eOI%.DkNe1JG})EUGoN:aYSr9rΔyb)2b0S"+[,kQ"X^Mv:`VD]NjG } ADtجX&}d]-svWQ8d2R"E-kX0*Y/<>~P.r^RG*KRffO_Ey 3ڨǔen!g]ʤA}PW.R俓{%(v<᳷[nߴ7pDz9_|@݋իan֥y1ivwft`s:^'|?V9o密]ojF =hکE5WʹAX4u*MgL/fWK ՋkDT2.efBJIJAWoi^*0qdzûV7*8Kq<IpX7Ɋi  Ǥ i[k6dToCӹtp>/gлPZn<AL;oԺGR?[]YsQ]ƣjr,5.c?j_oo&~K5f~|8s"O9^y\=_yp<|0_].~:sHƳ:NxUpZh49EDL;B8{ױGJ98ը]ŋ꓎{\?Cd5ShΘos|;^xͽN nhi6~E\-pa^1h2L<=qÍf%j:P vNZ;mtQ$:ø8IK#] HqRFRAF \&Y H KeǡO#J#5ѪD=28pB('O˕.rd_U[h&ǛmI.>ڴ%n1ceӐ%'PgJD&sI٪4_%8,0|tu<@XR( q17p#GDB:QuِJg0K,V>h;2T^ZFAX1@Px6$( U]3)i8=I9!՛Iz}'-<nq!GMl4ëس9Y^>rvw{9t!9hKk>` n|hN XGb}; ?ŋj- սq?B# d߻W+x 7 @>'\$Īƾ:S:80SyY-J״;͔gdEr)2d771\g3lyRmW[n7-,|e!=; te8/].}}%7Vm!׮u٬.X tgkޓKדv7j܊e7ugW/\qn0 6\ovg=<A4=ri{ߝ{x."lϊn;ջx9ƽN$}nvЄN v8ŧ7{k69d!m}ީB퓎֑T :126loιy\v'iXٺO=mfĖJhiM.!8UxxYAs:%*E dQ%FZ/FsibfOZ"}zh|ϋ\$xiU-;>'Cņu_ggi=O,-.=FcdJ{6T3^/}^kCzNJ [Y'b՝W= ՓD1y87^ߎh9Y}&z}yU-jf γ$/lĂu}nCGiMhgDм)W-np҃Bd ,i<H&r))\:@^'mmT%Cyi OUglF93MilV~SoNiʆe<A#d.LV_kA&wT}[CӁSS"]-K<2|23J;wA'9,pe s"頤 PFO|t14g(H{NfpfaJ d/#qphA @Hf"qV%3ښ5gbE正դDhˈU*Bp|-i~c@'f\rcИ2,ƃXuOx9lX(U`b j.ںCn!TJDPAHv.X$g֠}FHPDXgT)'x+I*IZ @+xԈ5!P,s'1 p>du.Ū~? b/"ˆh:DqcwϭRFJ. bt[BG!V9XTҲ6}DiqS^I퍏c :c!VKj{IXkLXI !RR$+H`=&y pD~i Z:8Oe2)`pV̇~LGDwg~2Qׯjwy,3_h>帽$ߖFءGBGր3RhUQD|-2z2:\2h3 RT;dM>9^u iRs&`Vh#C@XWk`Z Th+QŌSךR=r<":3IA@v}m>OFڤG*<-iYyۚ3v;ZoI :bIR/T9X1D"_) A,3Lv0 I1RrI- . 
1`K* k#H-@F)Inͪ#qjcqK/k XM٬*ʛW~JCDu4}&)2_nmޡ \s86JhA# <"'ˮCG'VııJ` $Hp'g{dؐcGS@ HV娓2HtBd #˭ʙg cIKY Kp`DŽl:`މ5r6Fzb0:F!KUSP8M#£ B` u4蘃#  6ES@O?2[m)L-ArQ;'HjBThBR* 9'eDZZq&m 9rVȓ$,f)0yB3TgcN% ` ANq88lKBxV 9Ig6r.w'|3("leHvA6r M%f RQv^Dh.jhWk,r[ზ·xxYk$*!85h 9@#AH*.s<뤕X'mU8!Zq1קGzzssQ3  Cʔl"($sQ( %G,PP.୷A:!Aqhe%a6ݦ+<ۅޚi9F?%MxV H:5,[0O(Y)`R1;2s̑)Wx.,^0ۋc˂=YhEX0x˥:MK!' ˞B 0Rߜ y+Vyl"R,Gw6['0 $kX+ 6`_b&%pѱt;n(ӴY%&0 @M1Y$bX&[cQr3!=NPY, RtpShJ5|{JYHfdLqGRK1 Jm+O%c2P~K ͒n^if!0q 8I5Rη0IT?y'2#s:(x5^Ni@z`p&|}WD>iuE2%o&'A笄Re7n5xU]SP@-6?d+Jqj8Gչ>͟f}mTinN^+Qҝs;ҍI^ +10b0M2sY DcL:imVFRu~Ce O,[%L*&v(D`v* ò)Qy&I8 `} BʺVm}m'۬|t.tlعVu[wg[Fsw?C [y0+)2mkI|9CЩ%*@+]]a RWZω7Wp^»jׄ왕CQ 6ɀ*d*I@2tQbLO#c9Z4:εs% ޢS[r@`3kTԂI96F|%zr<'y>V(Gy_yYA/=ן%$wݰA ï˙~P5vw8mۆk;uF?~gT~/E1NlT@{zߪ };s*ؗ2r[<л٥ƃxݭf]w?= :ꤹMLGLջLLgI(\ً(Wy7VoHNi6rrn|xމcX񰏋n7^gV)sՇ> JLDzm-mɾ*̵䯻:@j NDY1'Iv(CYE`vF:Z ^t\Sųc{k1 57YtkƇd%, ޒgLƔ7:+ J5PBN<0Mp4 M-{_#R:O7[ލYvK"#L AGswA:g9!ZHpfVȆbv-rZHcՐᅬQSs,+9< brx,:JJGkTJW% ^㣪ƯS2,FcV`y_'PO_T ?<߫DxRy;Q~V޸΅w8Ndg_1V'~/9QYߛ={O_8vӀ5f-]6go;H, Tp!I9˘e.*9$dYzp~Zܫ{q\ɂF=T~eN?-O oҿ ^5v[>'%N}8>ijdiop,_۽r8+ h\U np빌WH Wgr5zUr)1ƛR*uerſk BOld3T9$\3<&R\t%ԝ^>-UQNZ2%0"B޿~-H݇j;>yfGI5ݫ,VR,]j-BŁ);#zބ_pE{4owix 9rh(Vm>󽥆c4"w,`Ac|cx.Kw}!h:͜0S"OMnik xBm@LU٢AzF((PDX f̆LhȒ FYK^ grg.)ZR/E4`yig=,k A "dž <3b,wΉYtD敱!rnȰ`ʉG k&UƇ3nA!\mkgx2cVM&!(kh<#{E%2RΒ?{ƭj61kڢxxig䐴*K~4WVe$kGNEUKq3 푌MbsO|޲t1Rkb>At~b19j{"$agR8J/RoReOyTiEs\w17V DQDH)"C4Aػ6sm,igq3rM-L{j9S%STYC>U`NHmAYٝ"98Bh )| !hDiҙL0$Q HgW@R{HUP:%)zĥP4?IYK2=1ɒ\51$%MI *Zr%MqsRҭbn0+/39i_2w 3q-ms[ơbT.s'ȵuXZگb4_W1^ŌS;5աK c/s?K7fŇxjY!ׄZQigᢌR8&Jg) r Z..a?=fYR;Xmre5z\(w?|ze8*ou}.5(}X"*`r"$G: R,²^GM[7qޡf<\LBJ[snA?~|}pۼ}sˈnG䟏?kp*Txvς5@*'rdhDb{;8+,sOcߢ?GOTwj٣#.7<Üe*s0N<_SJ:.)*Ed4ݿm%>w(6jyN"pٳ[0t0PGWC)DXR(8%.\@{pT@ZBq4*\Ixn UwJD$DD 'RSx܄4L*AnOp59+lTbr;k%KS^`.)Rw: Df3z.PDF#؉UxW輿`Lpmo{zatAMF]_vTY0 U-c<4WIq'{@<PJ) 3EQVA&9M;-Itok n/]J&91xAG""e`u<!ɝ^iϺ錜R yZ\8oROFAX&/ `!&⢃2D w#7#rCv anKwB )h#_~<)]U J nn:HjZ^M'G 4T>0ǡͩ+8Wg'֣-- _dIB3຺B{}GQq8h=?Ɠb>-ubilaVS)xRV x_AJ13N8w6䐬b%g`e7Ȑ 
j:rҐ1hts<*d~aQWaq8:4rWbwa=#mK=ÌP_h6x9%+]2xqZORa:\M?Ng"|,R`vzE[tlg"76Vqfs9   o +uezs8Nm{6facϹun7Mﭞq7b#?5O7.Ǽ ϣB-2vnx~: Ѻ l45*xzY͆ RU?LiD6>.aUݾPܻ}ypzL렭Qe)ͥx; 髉en |Isн[BV#uP%΄qrJ\O0<'**Q:' =RL>EK-ބy)i_6J ^TPΦOctol֕/eޥ6"9EV:!r*H("TdpK :RF{$!I1^1|)!9p1:$lY*!m5BHCre$H"\(! 4bc0KQ!LS1;OR%'ֱb茜a5sKz=E$;xg{tOo(+/0uxFeN1L>'"G$Ix'$Ep@4e!i%d s|xQ<_S1Qg0>bL ZD(4Qߗ-{l֛ТEfw] u/PԵe7 lB|v'#ԅLDNࡇk48;ڼ,3YirԠq9Z9xVG%֑{hN,8|~Ehte(=Qё\^HhU2{<(GS{Rd6R{׉ܫƃR޼WiC\X3Rh4I9>MRIkgڀ+'g/5aip (QP:c6(c}>A~dq?E3 ޻JUX"XOTA3I ()CZM[tH}HM*Qϸ 1 d$S3rvU=2K ..2+7!Nt\ ,~qVMS ǫr5}r . # =F􊋦ѫtP.T˸Rrnm^lru+i+KoJ(p1'J(cmUF9ZhzWqy[Tgl3e-%"ʺ&kZ2 &X`:*Pnw& jD[FTQe2?K?P@BFuD;և]VF}htM{1V)#%AY1Z:p!YMI+mt@( j*i6}DiqS;D."@uX#EVr7я(Qg{8Wi PLKuZ LE $P+cdxD,fdDSMHl쪮zջVB.c[Pjaβ0f46xLPꈫ$O J "uRt;J1{.Q ً@H\BDB so' NJGB&g<ˤi!5ES}zۧwYǛY77XPx%yZziЈ&DHM U)QM= R 0Mer5 **~ zjw-DZ]e-fᠼbԂSDkQYHJx4ē!oN\TwdF[|;SU7W_BɼlP!׳Rq8i>^

o#Y0e)BH,Vx3rC[NW4eg?y0uk9[:HB?EԿp4EݨXF E)A!mbXtT)**fDA$BE˫~ C9EY!I3d %aE5GzJc E fs-Ƹ0"@&C'+ T ǒf& xKv!&6R&\l8f?L >=18u $S{f67V Ι᫕ PpLn|CY03Q35ZDi e4 YGt^?ݵ&=lڱ>k:IĄ|M)/ND(nfv;?:8ԿdBH.y(@T;*6KR{)rCzLKӳ!ڞGiRuU3=9c@-L'PRhi\jU?;,Em9K!1~ T뼑|5 x ;5Avv}c8}$Ac)`]6zՍtg(k%2y=']P^j0Ղ8HG)ca1BQi<܇(bЖ at=ː >t mBVm?mnOu~n[ʠQxe :T*̀]@Y 3peT6{+CDE[J' ^zG3)ѠY'gOΪ_wVu4eP*ar 5ҁ Z)1 X9*;BXq.K'*BsKPhVg:Ëf"Uhgxu5ysHE&1'1Gi%$"10ӆ[gDF>=*,@QDG%IʽULj#Q ԀXA Dx3"Eَ,@1d5KFTy~+lW*8%(E}~\ˁL368To4 @D".dPǻp8ZY#\|ˤHR* d]~a88Z#2 I#pPu%%"p~"*s!uc|vR3,Jjn/ O`& JIJWmzPIh S J^ksXd3ĉWTb gB^$'K/A:|UGse E,:+-55Kwb̵ =Ŏ߮a֞'t5Lǟz/1LfE7Q:lz|o/N-l@mpNrB{ɉ ׃4K|&iQj׌>l.׎ڵZ ,^ğ5ߌx"'eK L >cvD qifDٙ:w娩P辣/9ʩ9W >286l:ɸG SqwN2KerQ#+,nwC?ᲈa'_vAfBQ EY,_/=J:"@d_WI/.Sӝ41͒232WT4?\?Foi%DK$Q/(ήBiw7g Ʃ#lf:n8V1$xF2>JJG=c>ƵӜv^;= Jvzqa2c; 0PCfدf~ qt0tO] Lԇhn'wDZ??,+ Ss枨]Q$xΟ 0Gdbwm?N/疷g9i5fn;+l1-%3Myp?&>Ln?ʫ?o“Kr}$ɺ.׊ -` [KZ3 )Y̿>Dl%~ ;NK&8I{bCWV#fw[Wņ3O%pOgW49v?*IT暲q{Ђn -Z{*еlo"Xd, xc N!1ݯRJ$B{O]1hu`:Fay חO!vg _w?5ŽssGGkO.ߍRF5=%d8kɗk|?|s8tǫ  z krpR&f⯮beo Yr6gŀm;Y-<&QbAƠXWQ!'=,ԇTnv01E. ) Z·WvLVTH O2S}>DH9eS#ԾߤQ5j݁ Q| f7f~V˫-JHZCZat.]xo? 4܌?O/!8-լxjk`}=ϓNFw`Z?|IX% hG,ӊvt) z&NdQ+fpetUFW!i <]1=7^rsbD5UD8 1qg&t٠Kp^a z84 p++wu bQ<;؊⣂gW49v*2"ROb@â:#[6+QorhXAڀh]MB1!xpr a'h\̕BP->]N)Jmq48>F9I wVQRY"sJ(M~/Z<$uĂ1iyR@[ _U]jjyŖoĞVXuF~8c%\U𽖧H;qd-1.j&twSL@`62'Eµ$\3&OkK vD47D|`btM~iFQV9oaUJq*l=arGGKn639?*6A(W٤Q` ; t DB=s-& _kI$9O?R jE4v 7AzTy~.f R@C`5]~t\1`0DGeüs06Qc_<~LC!d|X)z6ftrq=2鹡nYXϢ0፰0 /,(rTH&`L(c*uOP3$eyA@1;7e21LζLV <,f0*Z]86qH^p>)`ƍ2GZl%"Xb-Z,k2YRY+E ֢sDj_^չފb-gW49ɞv?-7V}a-,2 H(H, $TQ)XŰMMR~kX YJ-J)u<`lBHoM+E<3gX*Lg.dMB̈v<+kuYmcٟ֭,τ,gvZyep'XB)nVx-bJS& &4EGNlNhaK&‘o )Ų~$Rkf,>?bX<|i_;'b`U YQ1IDeUWom5{$dp?k0SrĶ/6hO#&79i|!G{x$Pr$Nv(_حF+iBz-`wbӒȎgיxnػu"&볥J\ $ L. 
`O23' M ,לY#Fj!=EДS{dY)mY;qC]C@*%_cpP(5(վQiw^$Tjb`"@:I--ԑZq̐ AOg͂Y~)X룱 W룣tKs/:wU,j69}῅ƽchܖJ`lՋ{n2:W(=؊So}omd23}^63W?'XZrfr˗==Ykŋkm1* BG#*,\:9:X>UK"D{F/cjyjwa2j9T3`_șTO$ʤT"Nx bb jl8MmZ[7ßVdZ?tq:h|uvWMU~U2N4-l4>qO-EsJWBϛxoQ~b*;d67#PqᕻA5 _;tnx]"Z-ynIBuun}~VW@@#Sŕ,jowmx E`҇G$ }N/]No8p5=bJ1I=gpC# 3=) `SΘ 8~:IIʗ`TomEA(r**- j;߫m/g,ÇN]<%oQАyJZ@TJ9^yOSBgܴͲceɭ;אce<0460ҳ=/ *yO:RP_|ΖhS;qnWP 4<{Rўp5|fXgB$R2 8AqΟmXj.eVrPgNCFftNF]'ЇliF:u%0jZf2u[aҭmG^XITxCAzD Q`Enҁ~*n_t>P>8 7_jV\bm.`fr>Ds;ً'S HXy #[L24J.Ө25WtΏRsƌ9ji]%OADB-e+tW:*5۵ vJrZ)7j8vї%`PAeIi ,aڬ8(%FA  !"s"Pg!J kN6ZO%a;BBl[:P-|]?Hf=*h"'c J9i Ho{3٣t')!`h?"ض(s(y(-]QN$e@XaF1C?8rZܒ^)y-] QCF:wc}TS>@;Z!ǽ8(BVP45`wޕ6r$Bi yan<;g7wԔLR{E(X̺B-7YȈȈ\6 aEN -[=P ċ&H0X2en~7E cٰ_Yo̵ HMV*ZZ,8ZCXBQ)`XMbV6΋8# 8jGE6z,;dq(#+82gpZ?CH3prh1P8G05[ ,LE[ǣZc"-䰑]sX/nu~LARP)Z!N1yq(")bۉ@(k/Yc*낿xܭ{hD ۥEI1ֆTRA0r6۬39|ŸǏ0:Rf#ss)X/6ck3Yl6nbƽSR ZThYpfRB#+RZ)ACNG]!ͱT檭!%EʗT%- ]L86U0(ypQbvk3pVfX.|!]70\b+Ǩ-kJ])IKy-yT_H\M_o]&4;%䳋RЧ84N%%sɓHCDIZPk Lk! g0}g$ Bn7TRTEUX\Guf Yr;mt'3%C{J6\{ #)(BTW=$A%сNTi~Ad9y҉#m窈 u'.^ۂJeźYPvw/E\'C+>qʵdE,y%d8#JD^Q" :$tp]umW En)J)z2zo 'Wq,f杕8-{*­BUyGYǿ$ i ͢hs]>_OǐtZOj`]03aය4_cp5*]l܂Ū K/55骨 pŽWFj>4Û{ oAzfB'h9廫fݼ߬r;Om | CŸ_|X<L~8_l=_'|7x2~S)M|> Fgbh# KiN^yhuCQrf ޣtpGZ'Zinƚ36^X`mw$vPh] FF:<{=D:+aI7;ΏjSue쓗YQNlEX.aDxrn[5wLk`,a}B|lnMjVٜ9>X1z n8pH=sJ,߯ZK4l Û,2ӛ|g5IJb狦սi}W ^giנY{G5ܥy-}XOuzPg{vfh.^̀Ȁӱ,0 闰/<$9z]|8aVr$K;ebMw,#vKO,Q۾EIָ9Ҟ=ֺf{6K  NIGe 1̬tI{ڵN%ͺ/ZU> e!%F?āynOjpOrJ:`xwP3S1OᨉYz334NLMY_v8l,Xލ^=X3WI+ O'MG~ҞAl낵oВmy=HH~QK'eT9H 7~I7~HIvVnWU*qv LZ]4RSa vO.\-& vX1 .]^ )ᥦ`uiFod@Z( D偡AhBm00+*(@&BݶF!8z8-wj%r&a*}mO(DbMVi h[hSE0]ZQ9O"(!h>j\IPLPSXA(E_TQ`[O{?_Pɜʸnqr3.vy{9d%XΓj$OjɓۿC0ɝb5g:Hr?xWd.^^}Ehb* k\ IbRL l|8VCO >Uv7E*q_[h?C+ՄޮD ȿM3wh?_~w_}?aрp4kUP$9*K{JVp"  .?o( \A $VpRgZCXBʒZE$f7h#6!ᙋm"X FwI2)-ghȭ!fquh/u?}Y@aog{RTz#._/~ro{7/g~P/Y+/YC?@\(cF)koG`QsxRkRw.< KYW-C51H G+) 2“X &RņpD Q2`fH(E+C/m%ͺ?</0.[֘V7a@vVF[>rS= n9^7F#UcrBQ 0. `jl#)t5Fiv;\{Ʒ.鵿n g]J`dnKukXϋNy[lR(&A[ƔD^d+Pz6H!Y.$)4K0PaxA$,+K[9h"bF냠Tjdg.w|% ." 
"*D ]j9l#rāDH *P~SBn~XgzBo ֮ЙԓPuBk~'$gzBct'bLOHv.& .w?[pvv1lb""fE.V|b-.] ŴThP$VG$` %E`mi gkS.]LinbŴS@v1|b.\bHߟo.7oAzNb*WS B` -pgzk3x'XU %Kx$>Uħ1kY`WeQh `KdAjPd{•H `;LgJH;%cgb&1.XSft vs!9/*4=cM0.:djvKn%TX%QOcU!QJQKP%:`l5t25bI5: D4D=8Jп)zqN Lp65or֐Iބ˰"QY)<ϏJso~е9 @GhʁQY%ܟ;0Jg#Kڲ$_c˲&Ϻ'ݬ~tzh9`ySnV͊m6`"NU}" >yfðJTi6`,3:F)~/0*YvT1Հ׹#seG^oʇ!,cxX2Atcg܍ϱ1yi>!0hbF>'nz~͛Ny/ͺ{I&Mg&ɍkEoOhDͤ 7HcoVr荃#y28) 7oJKcHT/Lx:B5ub@/EЏחa/w2u^"ƩxD*]~rݴ&g*0# ~%b!҅f{R&UC5*)._+㌠zb\ү:_Yk:^Y3˕ZSmh|+D0l5ۈ'?k]&ޫ?khˉrX4Ae:0]Ys [7BugM!ɥY N*t&ڬoV_|nzRμXUXm> Wh.MTܔ{W1'po N-ćnúDՖpșhOiCvSC햊AI֝h1kMWrpșhOQ{_v햊AI֝i[9sM)vS'T Nh@S]GS&ޝv!g΢I<%xc;MNz1:c:'4U=v!g΢Ikc^מQm5x́sJzh9Ŝ1_v̗) ,Y+J12$(NVׁÿJ<1/$\.K9[pB|sJB9?` [`L1g* Y \1$$hLWgΆ1o`) .sV1|1f)"cB1<ƘJbsYĘjx1fZ8ƘsI`Řfc1ƜW232ƘsVI f-cgcfnE1fcc9$\1r]1<ƘsHdlx1fcc9$h̆cż1|1fV2IwYĘ9axx1f.cc9$-1sccY%Ac1<ƘsJL / ccy% =11ƜU(#Ë1 ccY%M.cBc+ BŘnc> /,Dcy1^Y11|\(S!#;v>rU.|bvk3p-Vq=|s 1]q?ī)M !zh0~VLg (Bq](y]W@7kcpuu=ˏ?'6, Y ۋkt 5Pe24H%roޙopv g"c9Tzȴ2 kk_JGe3pho6mT(̾~٣,;^ඉ&f%GbS*єy2Ÿm,H!*b"Tl Bk "H$R=q`#:D>]2[G\d]L $ /`gZ۸_aMޡh\ںekwS&qqaRaX"e>8[RhHJ!Wl#Lt >d.jV۲TM5OG(68YHP"DU㇖2kS/^PƅbQrv&e%#/.7_+`\\RyseY$8G(kA~xܿpxwo>((T1| C0Le;I FRŹ8%i!C ]8 rK Ǐ"ypPxꬥ,NJ 0w`< ~*X ڒt1``hOa0^i!7O`"4P`K IsMDZ swbĵr+&ChVaZAm'k5#\j $ VhșWR h OB*$e'āZ*\ Oֽz$/yt/N&o7PT>٩{cz70.SƕAh?+wc~1W(r=[T ZQ>A??`L\G*(Vю}ީ(Ò^'J`(kcJϗ/QlPt{=(< K© 8Ɠl|o-e ;.˕kr0M?n^;EaVη=O0,o$ub>|u7^}c. 
;:ntu5.E|Nn7` A9m%?SpB\Bk5_U=)-YphdWoOz߼>*<[<6a[}J*Ag9ANefr[`ZXߖVd-߽OyOlqˆdwcp0C* WϙTTia^[L̷:VROAESP+U=N1ӣ>և\?խ>LkTXqJIՁ_"[KN]$uo2C V!PvHHusC+hF8ZEYWs~ _^GW1 faq/%dN]f> oȸ_8.O[~<@70w#lCA]BQ TNUUP3ң*;Q/mڋŎU3{_:8PFxN$.pE}&1^gћ\_O'Kvr{ &EbeDgs|}-ޮ59ɆCC}%Yq b&}rLأK=ZW2Rt+o/n DL攌_ta%Fط?fl//2fAry$AhG1$Bkbud+t s)!L)B' _co^*xFycb]cԴ}ѳy 5FCۢo.ԋZ$V?smxަ)Վy\'`e6ғBItQH!s偈Ȼ+gBAjt>O5{5PI(1\"0K0&#7a18eйU] R&yfl,FGToܱSkN<ֹ%RR mg;"Lji4s> 74`<0 \JNӖټxJb(g8tb6~'N1Y> cJrsFPUo hנJesZ@Jln_IMH݌ͽm*„!5v'ZKE6蔣p0\1@gifNlC[ʌ900L3Pd+Bb6XʦeJoZvoFQ r ,j7"s)0}U&;tI(!xh2D0%2{94Rd${ܰE8sd9MbAej{hlqiD"B\b0?g%KcBf;s1cszhPN={d7艡N¬\23:s9jwƶT$c|'8):&@As㗱zmZvG!?Φa<3״﮾{u9 矯?z{3|$r›wbܧ[T4HOz&o Ü691;|pxe1|{%\aSPfkkG{]LjiֺFpbSvL'a{0+j&oT7o,RWRfR-4\j +m|dkD7.\jb?dRZ9zZu$lkM;nAu-quQFAKXWnml1zSw3wkA|G#ﶿpIBvZ@:D[#q-I RwkA}G,";nόJn]XWn-lIf6W#9srOgZkt|ef~eqtή{qp`/ 4FU/䧋W/Litq%Xg{oUXլ8`wRְTf%@d\$ gۖMe$B|RA!٠ 5^*h:H#v=j 1C"mZVkܼ4 cٶ+?DlPixSRqDf.! 6S-EI D[1 tsOΟ޹W4]R9 r!8ml BVj>-c؟Obs'r*+'k P= taȟD{"6$V@+ w}NK AMtu\j)R`Pg m.x|.b1M9qV!q9:q MMIq3ɲw tھwctMlDօ|&ʦ>\nr^[; L;yeR C(dBrئ0xlíeemUٟ?Ѐ`n7ٟHى>"JS;27g1̯S_~K_ڳ%iU]+o%!N֗ST8~rmGY,awV[rGi¢:1z$cQ /??.UMZ-:n(LXb"$'NHỲY?*W*6')-KMbM0)֮7UׇGZeD&C4^R+1ZP H\9:}_iwcTvfmkEh۽}HMwhR9Pj; q]{ʩs*YG{ΝP崲cӻ6 :LmD0?c e 1ȕHkIٚw$-MHr0,UT:;I&+T/ںbs֗c4zTnMRО"ZbCo}'jP#1zثa[z> ;"Xv^ЍoWړzvHdYroBr"/%ze4vRYy{6d /`}T_;ۇqЧXW=!E=CC_29dwu]]U]]%ϓoKPdA|K:#XI-vD^tq5Õ=mdvWq _>E, Y.°gsM\(U@ne1Py0pX3ʕT W&9|:W¨\:dU3QQ$ _V5q,t|;ha#%SI~`>$8)KD7<:U 1Jj6.r GazwZ;>3" K/?:`Z2_0dv'.t%ݽu9ĝpi> NmP eƎFpJ)ʟLܔ7^Idq4,TF^qRߎÆ^2^ RJxup"lmtL{ز1FY:N2D =F;J4/}V?1i//a I+& 'm TNZ9!ڈ]FrY!l*a`z pFhJJj%bzd2he6#\BDh aA8^EL;.LB7)pLM4((<0# A8jI"<:E|*ԍR#qElBq1C}\x5!=߄$ M~B[4Ny#8d;LCEKkY * $! 
mRXzVu`$AђjH@ j^"#(;xaqqaON`T E)?3'뇣ӫu|cLS-;pnW7>n:bۨ`ȑJ1E<^ԚRӊ ZƮ6 pq8ėa!V -`$DQ?vRCe'$=ar*ʹk Rэ"wz'Er [W/$[ZaR#?}%j\3g-E=ר&f~&+«_ g4/+b|wwu5y;,F׈oO_U|1 w67H'CED51\W<ܮ mLBJAh6Gw}[>fm#J^.w;0t溮b*edRBF' j6lyg ׺8V6o$зg'A6u?#9g%ρp7ztlG=\סt[A]ްr-~k6a}"kpKv8슰^K' bl¬ۻ-SÃT{[uUlP>W6Վ9qZT2 VˉN = 1ٻ2PXaa F+r̊ &z-y=WŸ%<zU``(97V6z KWP6d'f8^ a^ g;%5TpI*wV w o.Z^^KY@>%Ar,*`GzS+3#<{?o=PYn:nnk9ժDnpٹ DгfB-&3*Mޣ`> E K!yz L !v/7~_ PFWrVGyPc3&C7"R%E d3 5\ ,ns:28{4/:*ͭZ,({Va1yv-`q ߼|tHp{!w@Π{,y(-3I:P1$Do]T彥.o*I}Aa aT V́7چ  \${ѐX+[X x>凛^%1쭠1;FcQpw2k#Ht0_ՊV$h5 UG#cW;$.?A銨UiG81iyQ>f}ZԦvhĴap)N9mцQ%鈂}/ІBފ xT!+J7iFod" nA^_&e(|܇{J*]24;Pi_񯃩t*##FC(x}_7w|ƞډbS(c/EO {0o2OM]TNi'=cy#t>sµ'B0p*9:筆J&X1+Ń(1*wiܢASѽ s@Ls"ǙΩ{#z>AH*7Gp)ڕ6zSXE}{')Hn4&MWq,2P哂N) F4.U\F.d"$myW7UЛY5~].IJm9hu5szt̖@@G̯X;m4O{Y6?~p1>in@O#IF6\㮍k)B&8z)1^}HDvqJϠr' ib~dF~JycUDo&}eU?rs2E4cB%ڃs\;TEs\{;{;|v[yMyJp:EJiRdkcH&QG+ N~ʫySq>y=ֽZWI׉'oz C Xφn_elLMNfs.8XɎjEhjC dⓅt=d)X=ݯS,=5x?DFAO#Nt#wRAC3NFRޕ3GdN)̓ц5ʏKS<!@PBN:ԐF۱]@6ɯa^}w.rzENh5s/ eޅtDzIA;ȒHL)VCŇ9]'tuZ^%pvNw+WMؓnbm)b\deL:E*ǩPpؐLr̒{t )O -q&u3uwL_hkWp}H zJyDQ)Xd;b[ew >۳Q$]c|n]}?,l]d"[vMˮZ"DB/9S{+!Xu:!4Jd Xvu CٵmZ^ج&iٟb *(qITAH;NrА(A@4AzC,W< B2%`WaA܍=Wt[UZCFQԿϕw\e;SZ&Vuwzӄ2fsJ@;yꝺsmc Bީ6S֚uu:7o RsI{9/lTSè Lˀ(6ƣ8e>hk#:. HhCDlLѹ':>E NDt }8Z$gtR(yKU'DO(C],xLS5F"6P EDuhL0 Q=iJ .ClOim3:HGJm6xz6|ghnHSRM;R.#ܻA-Q2AQ'ᴪ;ʖ6 mczIv`A |: VO;,^A̽K +!1ނA<(BCPR#D[$΃݆%e8m$#Tn̓6R\*M6y`LvTtCiAeΠh94T`.5#3# O"vOk@;\Mٰ|v]ˈnrp-9pOj.5!V - ը0SkL?fRo(6B2#niRWq;bunţzC]E{a QwUΠk; p(BqnW/όW`PCj.8j5 X@y? dvXӦ< ܐ#jdUSRۏ2[,,>lK*m QSNr@'~?ڇ=Z,Hfk7׭kљ<s<1*WN(7CG mpcܛ/l^'hWpϖq?[!j&?D;uf{bOCL|{9߼0Il6uײ5fswu0+>ګ;dj1vhjwRnGv0>EGKQidWxB *frsiqW1_΢uBDwF³Z(!Gpfzv)훟]7_gKB4{t[p0$(5|jsoo<>iݪNgm1tAqeyYXi l@+G#P|ț"踉F۠%e]8R>ǜv`$;"Qlq9}5>Bݷ&8j)p# Xw\k]os`E!b4p ? +&LZ%(3}޵6r#EЗ^|? 
aؽn l;df߯-Y+j5W,z D̤H:3*d Ө6 )`%q&c1˄u>R1LNbWw.$"r5 Uٌ:ޠvjoX"~oiV{/|L 4xq ~&/epڗg 9>J_ 3t_n'n^ |?Lw4QJ9HJzsUn%;Z1"7yj+,`+xԞ51-*P7 i;W(ҟJcgabe:cԱn=zimۺE3jݚАwQ:{M+ՉusYhFC[UdR 3fDL$< T)3̜Z:nA̙Ԥ)295wyP61L3M9~&lWߌ Hp&EN&|"I) $%6v_a k\/jeXRɝM{b( >ERk5ʜמ*8D iFe#0{>jw5޼_x pܘ̦~,[Ț1S^jfWr+t x{ZT(QVFkS.(߲ݗ{gck"&Kkm Q'>WV=+mJsKq&liJZT-  ,݌a}T"%odw Zo"AoE?*6pֱO)Bo/ \$N cIɇgrb61dY.Հ2&/o$,?G/SXK~k\,3 v6 {&3E"uXꙃy?>Lp{ɂHy~ yj#Sk6X v.B4y&#:ҿfl͐g>d`GJg@I} l(v(}r ᆱ+@,6s!NL"{,S# 4>Jߞ>}\fv%+D˸Rd+N}]y#7+2}{k(cDZ;Є{*>O|pTzH*}C uPNab,va(!(M :| >WQAe Jp-nrՉTCENRÕ"ځ0t\_|ȟ2OE^od?wëuyy\$v⏦.t0| kee"<sCԥ0Q-,KDFd&sLH$1PYC}ծ&-<#`wwlX?n7[&tYb{YbCҶz^n4x9.v !( #x]GQ5r?|b_W8L/ҫJRVwd0a'ø:z2!-F )m ND9g$H<Թ9H8c U2WO@Qzbt8fpۘ!fn+hR (S*NY$AV!`ڤrKP&O غ =TdK%%bfxwNQny#l(l6bėqZGCy38.7 煛b ud{`c8N8YL9_O amO=U|6\\헚Uz94}Y܀YtXL0#k%*^}ˆ}0E6l>Qgp_̽*d'~^`oV|Uf+Y_RK84 ':LZu4jt/2$v< >B-c|,,3i%ݞϋ$m;񭒶u-+b2?& mQLä,jEh=1/؆bo-qk^fɃ5p5ŭrZȒw֢ %):H8cשּ4uqȂLr)֕ӸWz9z9z9z9e5;O#yfe,NC.ôh!.E(u:GbZ.ӸcJte#jW]eL TA"e2[CJ{鎓-2I-#j"JER+| )PfDT`F2|-M2|c1D,zUεc&SdDD Psh ڱ -Ǹ "W-=sa3-Zsl* ZY(NPj%j1)SFC5WSy3-^;TWC{lt+.citlv"qUt"XrSJ!J7#b#;9oup - L@)"+JǛn$rL)L)cuέߝ\#Yφ,Fb0QMw-h"ͦ;ZGHZjՌwm*Q[/2I<0creڗ벎 k,ro,!eSGZ-m-R"T#̞\H+~dbÔ+0aۙ#Z1kQ-H*~-ZD{rL5pyjo[+ ~2~Tݚ@pBU-r;'ri;g$E=$Y&l zPVp@WFҀ뀤1%vӝCe=a=`axv?欁ݢX"6H5֩QS8:P8zH~N΢ aA-ysXbN3[ ;2Ŝֹ\,'$ZnmgYG/v'a'ʮB՛!|@|,wQU WzQ_BE%ރY2bf|R? >-:З75˧N4GšWo!#d*-nyBJ'=Mo Tc0bi~x)(! 
(0"ۃ?i&=SphUS#I1am> h>L^:#Kζ_qF߹w"Ֆx{G"*^{ٰEX69PYY6h_v-)/g˚粹Y><B0xÌk,4?Yo@ 1\)5J4U+hOh*h|1\dDiB.TWvMIGakݢDa; QwS69Y*qn=n`7lZ=n9tCR qvbAN7wJӝ~2S(ڲᇫxo=9toA_=tf@8BZ];,5x5r5P82~y(M]TMr uEH#~r'UtЕZTG L5A3j r1G7:BIrND6N5([x\6՜Gr BCcǨܟMLy%Ǧx'0w y*SzNZ7E9XXNu[`ZiԶur nMh;W(=&&`be:cԱn}"Ũ3 hݚАw'T!Ն-c?jkyI5Ge+dY^7*tW@vFl7er'eH#`Rޔ IN[zıPt.$ !tDI$ lf:J's,}<>F!AT/GMwOy/RcψCtn) e-M6iMK:˖U*/>gA1`?G?CI#-VWf~sdqUE5"=6Y#;Y~TF`?mjGH/f xfEX0ٝ>ʢ0mSugH390Hw-WFsW1 ")KM3d,GYUZ)¬I\iwR@jLӗAA㝢+iwЫvəw$^LlIQKڋ=mjwP^gQ5nv1cDVRеT;_6!TJ27s"dxu|GS!=E(u]ry)Vi}{/>$+9nдDK9 ]Қ~"Z*u/ m)-I,1\$Kb7d7zDբX٪z0.va~ !b8(Z1:sGH++e99chiз!ӕN/mq~YŽaũ@ڋ1l:)`3[h wT)wH?!sqJoB)|Ʌsn\ 2M24.@UC^( YkĘoL}~N|@u9B2k+XoʎK׹k4( Ps5?5Iß$N{;1vb\7dIT !8&g,0)# ˹YiF~=yEVf؁ uVW?,3yn;F~~ E 1.y'[x~^lz_'AZ3bj7ë_!Z}u&rɍ#вjuq@CN0`[| G±r$ŊD ai4H9yk-Yhf[5\!+|5La  <1ajAz?@weI|50Ee1E9^<򴸦}#X*U۴=jEdްEѐ;l Eouo*yB Tl E6nؚcJ `YBkސDexw|l!YjE4Bܪ@=(0A9#. [Z zQVÈdd\_c=+5BY1ޭT9-0+-.]v(-\ڠJK gZqtr6G 4LVZ$7})-vט&_|/6ȱ&GF.v]Rs:11G<܋[opYe+?b&bTCn[mɥ^*QeIr KqϦU8ԿN./.Z\jeܳ( 7O͌>b7&%XڊT*VPNoن|U ;!{5AR-1P ,yPC/q#/'t 9?kykƬ (P'nmQKm( 3jKROsDAM2Kc{agOw|@]ܥLYCjd/Ϣ$p=ݔh#XW{-)^Czb+;ZØmCВ'{4$]} 1^sOoN1ͩL7K 5'V* DshVmu^WO6y g޼7\;=/=)$ƪH!? u[AC@8﹓I 5}ʋg2k7v{ h[~ҬACtd/-598I1֊XI]d{*iH %'hB]C(glEJju[9!%h$v3MvBVr-~6ޢ?lߔT`fMЫ)tW9F7鸕60E# q_#5Ԃ~c_g.%\DchO #5ySƒ:xy sسgXwMk1l;V B8]qPm|ŪWO-.y=qtoCwVÞ -NPǠ?zZ+7U?P! 
h@@.&t1k ꌻ03]m8m͡˜ue",y[Bl@C+sNm4?T+s!'x2ZoZ>*;X+KwwsۀjD,LYI X~{K>Qƛ +vԒr+IJm5h4%+f}Dj1a;N,^QO)k#MjcuҌԸT\nNqssh)9Y 9rKGwО#asxչr8'fBv-p(@yW&BC;8'w &o]| fHv7ӞX ҂c]t.o4c:VԌ?kFSQ7%g=tTOh^.?xw~:׹mD~-zK3yd V:nZ;Zdnʮ[gC,yyF>,UCDЅ>ά*=}00h4.Jp1]N?7ff@#mL\6q/=Abv3nU֭Nbzpm[-]/66ₐp\pijeTs밧jэ'N4_W;2lv F5ۡLhb͕zu-lPc[g_UoA`@ _cTXWK[ E81 ˡEG.IՋ(~Wu^y_WWb?)hBѶ0cQ+A;XTQ2hepd#KҰ4ܫ{m_t ϖu9b6_f;{ӛXy7 _bAZ:N9bi0^$0&8X%`C*0DBLT6w^`Qtf-b+=bWK&^g&^g&^g&^WX7 x2G| J Q`%@3I_}Q+7g6FI)O;0F] BQ|Dr>iF`&t(EňS"(" 3HAJնAM (8]C{Wq(Y!\[95^\5H!NStRL)/.3-XR'bT rނ3N3DGHps@X2$[r018DZ1uA@ܷL$f-ur;p̃'&Z?R:C_zD#J>cx79^}5@^By+glh9eRPZ7K)"2!Cu8>+ .NsaU[BZyEǖ m\I Ch/>[x^J%\)c&T^>O߿[>W>0)`~mDn*Rq%P??F#K!d ^gwwL.3RBЭZ>㒠z  ZgPiܓ І#2j,? Go лXdq5`^Sd}qڇLuiGKu4'9TDF]ZzXR?}HL?ջy`1]O35{Kz<עgػMf+O.2&e/ 7O.y78$[r=pUyf@rrz‰6꪿`<)֔c-; { G#,y 5A'T TF^1Nf#Ă3c /Gn,|_:aP)ITkSM""u W u_ߖ^{JL;m0(0KgXnG3Np6 E6j6D]jv~њSp4ʙ̬ R)Cv5MrLhc4D-p'C9+Iȋya@% Lgbr _Z%5S5"K9 ,isO5 Њ=XFQ7\jzФ FkŦ|/7Wvmz^Jb3և}}d z?f?VفJOh h|9Gcgo\39X+!ƿŒ}Pޏ!*KVO-pAI_io4&@0";sZD JkMhPˌL`"+[ʻ SH煋0TCViy;q#iC}H.pU@`d 2K uA)_C&Cl@|duȀ.t =n cɆ]fSv׍.XLzwormzi7 Ԟ9*`Y`9ZUE;iI Y|;6U=7Ynd4P8|5KQ$\U_bꋌE./Ss&ќM6T5s%{>0 1̱XfX"$ᷫsMj]գ_ֹ$0Љ^@۩Bfd$YF!1hP"(aZ+`S ߽Z`ݕh ]}Y9Esb]H@1Yl,g^ظ7鮞J_=Lohg(@G׮Qf rk ϸAlX֮sPCT[P fwdeхMa0lXR B$:a\$jSS[P[P"4(\2HK$zzqvp !G'^?ɣ+H` qOF(٩Lk6evȚ7$q8+D1ĈDˬ2--zF LiN3Abf#ҰG n$a\Mҷ̐DniF)0vEOY  ERdHCD ɲNKƚ0j/ekX 4,V1cEᒈ 4[Y(I =FcHj7~}K:J@JP@Ft_=Oֽ4{yw8_@P2_}!<ݢsԽ>f:UV4Kחol ê+7=vlg/jqz%{ˤvJu>3(ӢﶬY[s4i)!G:6US4pQc`xmZBE_r,o`$U`vޒbR4sGO-G(i؃g!fYS<ԃ's{𬶱<#RH0cP)`x4e |3V:$C͓,UE'S_q|r|[ݸ#\O/~1Ǹ7qǨ]k>'i!us׌ `ΗR [E=9D֧SFO{>ďПOnS. _JZ/o^e^~tuӬӇ~R'iwrw7[U-0Kn\=VEҔ5=qE x ^a R[t -5kP>"')_➂,s*íy SLbnKDCkCaTAVևJ#*RE`INB^m9)QR41DpnZ#nW:*̺eQzM6]-MY[>*Yů/U[T}nI kb"x4*l[ۂt@юkݓ7 m iL0V:8fbS%LyE,-աlE釦[TzӉ2]vT\iݡ7 %M_4h|ijlz>HOӽ :Lt[xa(Oژ#( 7Y- W*H=JB(F3o< Fr[s2wiGo*Y)LzkӧЙ֧$bdB={P(w]M"y~xgՎIMٞo:ޥԕ0bC;ނQ+i<9T^ރJEx"ZtgT ^GE )C] 8O,iWQCoE,c??H8Ftnx:D+EXK=L'2Mzis To5bJ{ EÕ$\u:Ⱥx}A.B nﴶqPkƷvb\F%G,bRR9IaU 9Z^!ۢ"e*sRۂaH{/ 8ƽpDF1P'ޡ. 
DCW!+D?2͍1-*>Pf SiNIӂ8TYQL`(:]%iXA 䡷Z"SR&mQ.RJ3Q-\ZSUO=F"=# ('1,1"ވ7>SQ=fyo~(2T/>pJ6M#o:ub#ˡefOs(t W5h~$ ry"tx@ݰv}STl͑>pmGԠcR-V6_[> }P9ݕ~(E+ݠ)ІE , !X,ʠ ۲3Cs6qXm`{WafvP  pU$T;qrX&:AX۲3^yf`M_#ޫ:bzbt;-@I쳞jxUig'øv@ح sh5r%G p5!]Fxۡh M,~bTVn)۲7;R-ß `u.1ǽ; $5&jlfp>ZFu}'؃9`egp'Dݟ3v޶qx~f&ss!r"sEqR-Q6|)@]qj P~+a+R ʽSZG:vVãO§F`y+k4$=:DSboqӷ;РdPZu7K`$|3<.ˡh/ke4NRJov^@D• 8%2Ǔ62k[DX BzέbKQlƾ\Z]Jb(x̧OĝZS<&O'y"[=fJTΔHh<NTGہd2%:OOk<<Ū 7e"ĺk1AViOSl"nj MAyLh1UBRp" s:Z40 / e(@ؖ5_-7c ;Z>/'5FG҇턏ƻ>N{mY㛁?6*843eݷ Jr[:u"R|3*)~۵_<}$co4>s5(77/}4<ߟdşoŌKV+&˟O' 4v1Lor.LB'3"E%27g=h[#'y-ҋ1+^\Hy|Rq,Jh*s9~,HӚ R%aCF(I;F3D3" +cy|Ǫ4J! Y;00[cuH&jew?'@shB_= U1)zR\SV zpaG5X sTvGڨ6"wl﯍v֩ ,KY/{Vz#\^BR"jrqxާ)-D%Op{^"ΥqI+JY9 AyuKIGՇ\(5" !4 SVe3Pa{5x!k$$n*=%2Vm)9t7h Ɠm|,qz1B%.qȤ SN%&AabTo1cj;q76o{e2t ycSXe`!AzY{1cBvM4!R> qgFhA1C<}w8DXIɗ{;˕МVnu$UA OH"Z<^x ֱa[85=vqgc^&*XHAP 1,k OOP)( iSWbluz:M)\L\a0XC^ .ۦc̘# *be?ۿ˰bAJiBTDWiiEftkԅ {98r=ݎ 5o6V(O7kc(E)l6-{)rCQaYT"+b-{ fâ,YfEmuZ cy"WN+1g`ooiXS[ZxfuZY"/Tmrv1Qwه0M,4cwrOY=fP'BW]/r jsi+ |Z6=P+pB*um7%kܿڴ |doɳ\S]7w&/aMpYc}5'SD2:W8B\͕+E_zS .O?U!IcVs9J=05?pܸњX7X~ DR]IM).T۹&! zc[MGIF63Q fHG #VZP4%ɍvfʨ; ^؇Ro$ 7ny*SR5<)aT cx-(eW0N#}oDHo.rE.r4#QPDL /9\5t&XEZ LYIc'Mh#a ʅf/Pԁ} 1/\DK-#>rfA]Ψj 2I V] V&͎Em@ppc7 0.\b#Rrؚ ^ eA'yzt{%h1AKE"M"kfd6a-=07"qoֻuɅ \[Dž''+P9G{Nxxv+\xFOv^wQCLOQv &#ҩ9<xJtax4͞1-=(Xcw<51dQ%OQ 5ԦRjZa5e ^a/.6)X(K Ej̮9-v`ג=p36$*pLmAXR$X-LpZTX[&wyRF ޤYBk:>`xwZY@uVȧ6jeI4Rua6C7H-7TPaȰ2w "Z Ƃ|z7% Gqpc D-8A8. StnK")l&QBz9XF_3JrikU݂;>bW P]5> A/VW8}X|Yc`ahqTy-ꠊ @86LAt/uF:'Sȑ)q# bWGՁ*WЁ Ra}7 W3Y&c{& urr6~sb9xj*GĕZU D. 
D#jՋPPbT'?"INeTtA4n:R {C{QQLiz2Ҵ#Fe"b e@/ ML,ypXsB9$s:Xo\YT`.%P[Ձ*媃ԓZVJ-z++bʖ@p7тIJF].[|%gDV4dT-Ɠ&z}2kTh|ֿ'n&§%2e-ɒH.a\B'ƈd@qdj'";׈y*[?;縊>UHщ!\Th,$b;*i$"q&`\> >L^01b=VO@6$0Qb|VHal_gKлݷgSPM{a{l8&~puw¿2ëlwxW2Цhهvn0g{/_O!]sߝ7/_Mw^/?פ7 w^; Q㻯&L$eԞz>wYݡATw ۭ+1_[W߯ɭ^pwl9%4&-t/ݾ~O*p8[匋ogژ }p(Lo^_mZ>\0xxvIɅ>jѓ:~x*^{tn7 d~i {#ϘzFj-'vUčٵ\jBDTdվBL>v6w v.oǙ~ãXV{ Sw] =(]?{ |78RELknq֦"+޺e)~H jy5A&ф8UR OM q4!&ф8X- r̐||||G[ rP)8{p*xyBN' k-W]7&ֵGY'm-1kcۉml;m'.9OMb*D!BNppPx;j0 0P"2+!lJPu(%'((- \p8KZ\!6ݢtnxaF ʯ*؅5mTn9M[dšTs* TxeBgu2ƍo1Pʍz_%O;Hn;W^ ^Ka *[b \S b+"6d`f `j[vnV;eMp]Z[-M/- jw@Dk[R g [[v<<S{rH4˽cXػ8W%HsΥ?dcS^ !^uh˔2/=TzDR#:;ת @!AƒTY+_>)[*+˨WhRZ?tYz:CK`b1$GC Đc|E|XC;|[4!y-ՈCؠLR{k)K-3Вj)URԲ1Qȷ( ~w%IsR)xeUrf{7a%aJhV<F4_bi6U/߷˂?nTڦu?t?wA$OwvpM*NPZݜ ~AuAs.]'k;ņ8fX]*1}>%-9ffmf>i3{tc3 a3;6+yL YHCH&obM ̚j &fQoc9f6\+ 4t`5US-Nàj>(Jկ;eR#4Hne .qk,L+]0VVߔF4.4بyQ HM"s֥e%_iT=o/oYzq,^>a-[Hpi2Ě!c)uHh4kBV6A[ä,ƂDS3oSeJMiiI;~.KNN[voK#i6>V#k"Q`iƵCߑoEFt;\5seYiU'y|\Cܖҕ)71SU4{4*k7Zh)k9EV!C8-ؒ$ b#_=3LH"~Ĭ\'Jc^tvP5_X-Z (p8|7K !T>Q dhsђCk֘j4w[s՛x?1"1Tb9&ˠ5:{=1YZ"*<Q\Vz+Q]kv^vizc$6M]4uh(Evb2|MS2}l >}&O=@a4Vd K^/rW4HSy w;P-@lk!4gmVIvÐ\^2,E`}T)fNj "Iý>{vO<{wa7-'SCIf5XGs,aL6R-<~U''֕|R_ـ5g(fP[A˚$Z#5*ƨx?l3274<4g?~wpn 'gޙ5.21L]eS|Wl}ԧ2Z`ͷcSukJx ]gC#yPT> ycqpB"7D]ᾝH2j̃x4/S(=R0b֛e_l$`Es瑀ZKa>S;pQhy0+ᏞAicZøe9Lu PĬCNPqLI&Lt"]Il <&qǯI ~3pv"r)ActN1w7Ф|dM1}mMJu}i9D=> ۉ~MVhcqMw6]/=px?e衵_mR^T#O LФ h܏IyeZ2jZs.ô pw,I9c֢&{!J8QZE_,0R:UR3=yv| $ܒ:>=3| Zgΰl)rLhP@;71 Uձ֤҈YӅ\i jB"f'i>*\|zV>n'i\j,DqǠ_H~IK]SuHq\mB;|F%c4hb3άd`LCj!&)tflK3Gѻ,لȠE4&f;aAH[Cfa<s7a^\Mg_^}8\O\z8z ~ﯟ-hsu]_?v32BbsS%kc}_}{|Hcz`d^b:=;xj `5.@q Go?h?ޟ-{+_/z ISoQc7Q{RmmG6cnqcctУns:<nsk2bt5Fs"D3c?"Kl8}6w-ћW4v6 `se C: VUtw:Ja,y}gJF1yohfZ7` .\oO.s*N](,H8 ?;'Cd* lX^wd"ϸ ˉvסKҞ\N}q\ k:CRx$C̘~;3kN8Mck^D }MXQ; M3`twXRSmeAGa6]m#0O¸0/C@qĢg,R۠1E۹(ޱ:M%T4E%,u`h%S/`kՃ3ۢDqbeͨduf>,?w{joKz0hO5GI5U=z|WFNUsrB?o75P A;}wﻧR?jU=/d~o c9i=tj_[0|T7֟]iju(dzTv^Gv>Ej "ᚥ@17xP܎ $ti9wp; S9/D1sxf*йGQ{ x8fbp#F/9Aܺ]߈ 72ϓZڮ>q"" ZIųj[u8c}U鳣&Ȩϡ~`vh' pe!S$4׼0/3w~ԣ"\-OՍ?_]}eӍ/1,l2fZtCorr 
{5Tk%́#?L?>*yH?޲9jĽ`Xh%IOYhPbMd]dET9ʠ4S5bC5}> {A~dAP<2oMOth,FA'09= '{_/aO5\x;5 ,{0HtnJ 3JdEwۂU@ l\9K4õ`dl.RXs@ Z9Ʋ[RJ?u6(;vS5Dj3@۠p+86NvNfEGoѦChOAO.ޜ!Wn0׳/,> B9 퀱8$DM԰Tؘbh[lhZaBC4f X`*Ga?cUG}A GdޟXi9bz㡿^>:-~h,4fakA)N9.eB*Vj{4Q7m|dU15JyM-d W'Y֔o;u{/㽦P ,D=ډµс[09' Wc@r4{qCkq/{օԥyXd 9ɾd1PT3/3,V\۱ .ER"tN{[Rb% 1 йRX| !s$%֚>f@,[K!~4z) { WL>_gk@y _> /C.*Hhr -F!+Xͬ/XKJ;%YϠm`%̙wzV>׹ĚZ8͡n1%e/i#i{ ڐi cLIU FR,0ڗk23ұ08vВN&՗ₑˇ/m051guPԌMMfl*thV@꠶ĩ!y 8 s#k>=bΗȽ {4)Sh2b%9,D잺xԖ x\&ҟ1j UQHguOAPR HL0R- Z i\ ۦf7漢ݧ ;^%;C.2d@AS"A&C$9q8 WE6+uk*JgH& %$:$t N^Y2%%a:f.";# M(AN2J^$zv7z)ICjY-U' PLs)R}\xr.+^dw$ߊ:LQaV26Y]6$A[$[XP$2)js]yI&YMWȤDЬLj$:Vb$ +db/ mA3S gHeuOxʳsВ_)2vd 02puK42pE)P2LQki":6e9$?VB A9m/1!jj\!:6fjbBHP3*r\F!-v7ҀryU]`'&Cb=XGf婚~Ui@XK- qp!$ۚ\M2$_o1%jHñ22fﭧNY%~)ʙ?Bj($ڡ,ʂB@^mlAh r,`J\ >T0#՟`ρIosXJZd.8u YoVa+zmLN4>؊4 Dbc(dg]Uտ4;i.aVB;rj,$@.TƌOH:/0f 奵`^FCkZ&nHfg< rWۃ5Dґ8bسYVg쨦 ^ wn6EIm< pKWdd㻻i݅O[ J'Q[_t9`GWXg)pi<0.|>FB\`\3c-죘Ǧ M:r2@g=N1s΅,FHF:,[&4a͇IQAdc&|aᑞ&xM'/b8| ͌V"9 2 p<8ElU:GrJ,<*T9Uj?ɉD&^KGɵX׏Zϯ|+(y& Wrmt(|6sQ8s{W]A A<[?~~uW7V(Qk8A z![mMQ\N8L%AvyC!d dmdrR[ڜ&·W|ξ&cϊg;=y I%ѿhk⅙WgŵM¸{I\!z,+i2Zgx9;w۹<9`ATW@5?+y9r ɠ2u! 
+>hO9_:߇È8ę!X3 LF,iW}\eTdh{RۛҌf ȵa |}]n"R܏>Zŧ:-R7zXEw׷,Aog}}]F h͉R--{6wn>8ZsSf'_6EYk/俶6#,~/=y2g]o;USQB3""4/ݷ٧mK7!jXi3>߿?=Hm'TUHVj\Lpv7XAǯl[Vj/r̈>3=Q-enx^C7痟rt_>2bp2ӻċ'F>8B?M~όF+w E1Ϡu4QjkG =YS7Ҷ6t*xN£quZ߄Wwmĝ$~7}R-jiDK{Wytպ1rqoOTǣͧȔaOdm(#lO%,7##/zvd*&Ī[\l#SxMPE}ZVް5Oz ʭogO-t3&wH+g&wdaKB@Q=ę8tF~zw]/W2^9t{L\6e%Y Yt 9kuEpLt˨˘1:jH{#/:3"+PnӟƤ$ƬxǁI= nK9FƺxW c?Ow*7 }Hn<<kqekD&̞xGc:fjw4caj#ݱ5P;j.y!.|g)ZU;:K'\&N z0g-IMخЦT$ypVsZWH},s)繰&M6l+zTFodz(0ƁiR3U=~l 6/EP/b-F^X^b Y;|7 c{>)+TVQǂ)CoO1cOڼ;AA8R TQ)0tpV5%e$&ŠGd$qf;Qj/5ka"X4춛T*܁8"J %h:W$grqT2fϨJXuQ$} Pfm#V,vIF<>mܰtCyW#gњqw KS˃2'#8ϔ6g[يw,[ BP?IxƥНb2ro pfsPhs /C 9衲N*+}tߪi1 c/ByxY%U=Hdt_Zi*iނd`F.P;:.iB1f;4Xn[Q}E..E 8h22`e &F/!{ٙX`g̓Hr^K[1`zf[z'6N ǍcHwX1?!1pLy7&u'ˑWG\n.ԇOG0hVM}g>$v,_:^ +q@,ǒ׸=IlA~Ɋq}eCbYx̌|@Ku0D^r(L$BO7*N< ukPG]amo ?@ї"ͳ@ SXyeN1Ym_z[fe~EP{Oq9قh>YiKy֑~~ܼ -GUUOL|3^'N~ &NahuSvcaLuNf(toz %v)6rKO~3ʖ7StժU'}wd3~V u"%C0C%3Α݂j%9Hj7yXܙ~20p#̩@w>H !$LWrޅ_ZD}D]jv}ARN prߕgBflf r3@ !zфB]qcz IXq |WYOWde{-Dy=OKadKU;ެWPك)i*k0cWGO3.@B^r3֬sL9Ӄܴ?bPWYRGa{j|D]D;l/=(G^aymf @0%1D}e>PB2WٷGr*qsjCfĽ.'u1#֣u$]uY &G KB>d@jQ"YshR够 E ){{4:O٭լTTw_MS"Ϋ =M?߼=:n ŶYl ŶY-c+Rz?<7ocyF=tYG| ܓZH K~)F$QKPֵ<+m<̝;Pt}ӢaPnRfPٱ9Z"BZ+y#yxA l7,kAXJږoz]% ځY4M2r buz ϛu_LedճdA 8xP)`Qہ0Tp!ƴx5[dV[[BMXH*à縂>Lzcc{ nhspgLf P|ı\K40vl3Jh_M.u}.-yR|N\NyaBJْ^)E7xҝդKM.y51}RG7AG uh]}(S -S-ik7U6k}7p}kj^F8VUp縮2ݍm] I.WߎJ܊: gJXƆBaN*Jhd:ruu te;w xV_8k7!/&V׃& ~|LUP`uBqΪ\]3Oܭ~d;ȧf1@.!6eJ9󭃖/w Ԝ֝WvY`*wE!Ѵ239a>_h% |yH9SGr*θdyn WQ5EI `{RH={Y[$rI/՝kƝC []H1EfE'zK>p`u)2~'zX*B)2h/jQ8=Tǧ5uhDrc#w ,&ٵ{]M:n{W_?vE}RLOmu#,Oĉ A5;6zgqz)p5y P&N^1(Hk,5e _yўvCa)q[hA" ]ɽĊb\'ȌtWĚbSNU-Q^oĻ9q}RbgպH\cT+;!V1|#c%R WTEL*%T|u0*&K!dcX;P上 o<mFN߹fHJAQfp/m+{X8R5i0Nлv 1/-2qVBn{mt оBg ׶Y89ϱxusxE)vkiA@b&Y օpgoE@Z 7ucֹ=>V5!ҖiYt&)Ύ^̘$\yj,YI%XsOPX$FѦࠐ*Q)8L ddc2Yz6L@/V+pfBkqtPwr3Funx ->0Cž1܌UGƹ6#-+_ѓHkB@k$o#@N#•i6"K9lˡPj2JR ]_}Қ&tvPdD=vQCZqk^{lE_ \wPS-{,^"Fo]^Pl~Go!6mA.(?0š%jeW/tQa1Edhn'a.7ԐrkV_3bTJu %r2@eV(BIkuWèܞbo(*.y }<$sxC*7*Teq-MsX|NbĜyN Ď>=4rTLzg_5E%Qh"*0vuǬt?,z^jif7h6u++ڮšbkݐSֵ-DR5::~OXmWDmsZ~@gԺ9ܡZIu+ǴN 
Eާ]PGW0XZ;p,{<;ᓤЂ{cBkȦ/em 2GKz=R~|îrWt$~ 8?"CK]Uݰk^5W9#׾E}⊂D~(Q`RcD>>pfA4>[fϟ~q嫁 ݋a2=XL&TBősB4vpVS\4;RA{FLonz]?]MMҺnfE>@gĄd] 5a1+{ƱO4.q1Z`.6'O D!#!b. 0\ԥ,Js~>z͠tmڤYlFM.CM؏gܠ&#I%PNo2Wҙ/ʇT"@:z1Aq/VlKô{v4ه{{}c V`bj7T~X`玲}pfnDaTP%Iscl.d<Ư_fx (J=%|"L2AVD() d?Ogzn<||vt^0Gn80 ]0AՉKzvjtIwtk 3|f hcƻ8c";2 t^yO5 y"J4g\OEdHi۞ނ&K.[.@>Ҝ-O>̾@A|!CR7WK ή?^/L.Ԁ+6ࡆu8WpOß\X` Qenr =o!ilJB5582/ K{^0&ӻԱ:=h2af1%>Ng~}?Nm 8eĬ4Kŵm>wGo~_}0_/AF( ϵ} ߾bF#µ'#)GN)|\xxOf8"VOㅗ>7&S<O}0P},3`%$ė~Qjrx)'Sz^\}c#]Kwv VtaO׻iI7b[8uϤW#ꥷQh+ffhL\&7i̷m$O\߄N@TV*0^/څ})v< %|uo~oy& Zsdz/Zi./!$8N&KNkH3ّ2W1t2 al~>(ҳO`'Ls!;N凬e}1㷶{eLϧdzcgNG`Rt{4?6;ӥe^ⳑ_̡#t1͗8{BY_OWڅ?~ȼS^4N_*ت+U }g-)P+FWk`b\(|ϔ.kWa*D O ҥkoXC#-С3ǽK4ٜx_Gj r38I*.U@e7 rߡ߼|vTDGWcʑx:(Bw<^Z~(> `]Xfyn<\8҇_5=!ß݅ ^@_2vTJCK2BI9.mPHj qQ;{%^piy>U.2]DҜQ%H߿}/(2t|b7\ϋ9sQ³7/GJ>8aH r|ɩ #k饒>fp祛pGk0 4 /gGTxǡB jT(tB>2> G?J@&0 d-;y&39y",)(bWk&RŃ B* lH(GPX ǁKEa/")yP.hbe7AR3"3<yP\5Qyҕ&E1)⊺-(&/\:fTt3)xwqlKoJtk{"KL<6W]Ap+h9Rlus:eq_B:ْ(B[nXnܹܰKN%S!5촰Q7P!ܵ6k VA십6jzHH[x_GQ,T\)>9\@2S̸UEwNn G}٪YCغkU'15 v%Ҭ%1ž`𕊶"6!RpCi %pA# IB>^S\>>X m`H,Z<>O`K%k|8g}%] g!̺Qm &pBDV -D["=~"=59zp޸ *0%$[R50[qIE~we2Y;|INBeV\j%p.hCy] oƲ+.־q8Iց ^~fDIEc7#ٙL @BVD2mDrH!HY=1[݁SkN17 q7 Uޯ71xU(IjA ց\h:*a{x ̶JP0<~x gI<|3Ü% LKZ Z`u276›ɸsTQNٟԏ.nTM')V%lKkʏJZ:ւ-r|KW]S10IE=OyVJ?]a_QWmTjt`R14l 51mSo({So`s@ΛsΏ{|?F[Opo|'jZU?I؝콻+ɸ{OjL%i,Ͷ?'5?%)t+^V*u/ X OuҬ)l0 iALܷ|4,(Zt`RJL$Qi & , >1#28oy) s@PصS-#D ,,>$ zӴ)6_j@.6eDhbȏ%m.ohT%Z$`AF*(0DiHqJRQBM< i :Ire?2X(ıIb|[W1EQn t$~BLUAX1L 6c#1!1"QbETZBcY ,"J @ch`O DHъ h1 ) 0EBLHbs8JD@n]= )I$!).#)tMz?W&P%ȶ ֙=@ڍYVz&ʷi|&ʷir*knb:)I8i Jr J͠ J͠3(UnΡZrekgЫtm†{`T * ڀDab4 SQ 0'\zIZ#*dfW> moCh|BCh*A'uXBiy&Ǖ} j P@mQbP'Pl {d +cR*AKI 7A+"KC2 `QEj߫Y+^6QR`3ɂ@^ap`۱@ZAQBqU=%~Ȼ"97H]:W϶]nNn"s)_~jd*7u\ey*%g0G% rUW T _?ua}a#/l(GX/%E!7Zh$:(LȘpa㒆† AAݎEkT(qRѝg ;Jq2`4(;fBY.v̚rh0(A r#2J2Pon ̊#MnLyH8hoMV|u #|ű {ozϭ}Ʉ}LoJ󵐇ZֿRw {t0Ϊl/R׫Xȟ{; Rq` J)ð)Fi7b7r %)_,s +Ka_ϩ^ὍAERD BF2<0 "tb JpȌɪS},ѥ܍UJ-~yc ;7} cRZK8js-!zK'am&W$?,@YVе;]v9;{}>uhOq;Hh|[ѓeVqy9,LGYѿt9z=Od/\9z6W=ӳ_|q7?wg:..ڕօPtEFtn(r Wr߸r. 
ϻBE] y P>?i#GxuyA_{monŗ#XN:Oi:UO'{iviϤ}캻pm_qlgG{#+d<\nLg۴8z3 \7\:O{?G0WcrdֈZ 3;،eKTB 5`y 730)&m۷Fq]z9+sY<瀦m״q8Iv qLC44K_Hy"x#6NZc4kfutehNsJb;׵g;/6H;t]o4&^.7rq~~x}aӂ%weuJ8rk{ԁ;u\\\nq?8^,lY/9sǫ4ëH?^'qf2T6m+8uVozqhd׾)T ۷_|N/03_ /( זwwgUhY ϒRlwSs?-sK8u5bno\|u}˰=YXY}bRɆWIR^߉Fn<- -KzqXdBv MFOQ65 n7P\䜺;ɜ\}M !tOw)?CF>Z47 O[iA46qQ5I4DS^h'D)_L!1B R\c/+2؆Mk7#ST;ķSDcɪc"ۡjG%l35_%άY|`'+XV"On++XC}n3YSL7fјчV~]/_;'PձV4}G>Iz*̋3t~#3ˆVA2UŸ뇫WUZ z#] o${fj珞5xҩJ4;Y5B1ml>x*5VA^VZU06 َ;!uޝ+Ƽ F &|xoě+oVAJbg_t݉ aa'#7dGe̛X2ސqpoA~[m{S߾!0(jɨSȞ{x% j FͼuT'SM5y!36žAF (It` FvUJr[9Ip*6QZZOT|5)+u!W*qhQ( [^y}UP.ج/ sE3NjAA=@o R֔s2mC9& +^Jgߖ$?zs-.a }@ b"|}L%X<~ -e H'1#:OLcuRسlb~osߟ_) E5t#>o\Uia7=#hFaxY7LY;|Q|Jk+wv3C?81O7=쓉خG`}.jm?l9+wG&qw4::#VVc\a.n'}hdp^ צ~bXQC{E;Gӎw} 4wA Nj/G?_vBb`Z4/m4žGSbE:*뽑Q-MFȺIoF,fgn!l&`Ǡ° R04vG[8&$iD[ ¥_.`/-J./tפoϻz\ackp<+(T>C*JN%W5\~槊yw(+Y9(@:"AXT>h {Tw-9޵bQuWuw{\ߵ,5/ Va\Wq^:C0!\1[j0?(U}pU}FoqFoqF1qF0rz&DBDJ)32_ᇈ I|L(]JKwI,~ᙨ{&D[caom5Rd=Wȕ0 iALWZ) ) 'ƘETR^=ȷWq@^ {Ru(JF>@Pj[8B،OeY{ȑ+?5|I0d3.$%g6l93`e%%+7fbU׳,xʫk4PZ`7./ɋU861={:rhҌ$˞^SI4~Ty}ttn"Si bT| %ĦYfQ]fz3 Dbs{fj?#`ǖGKR6}IaNLn0%o_>{^v7۴6`g9zhV&|[I ڹ$I)LdJ QZE $4Bg IJX%L2|1uڶv pzB@s}OxSCAY8׺6N%5H49mt$${@lj[FE I*,yJ _${x>߻󬍾x?GD݇/c1)ԝcNnoшO6a&ڹ__YRȤ֐AV"ӫSTh+1!盿ɛ7-ɏsKܒ8$?n$_/O+aRĩ4mc ZgV%Br (喁 <;ϧX&dN"H p~}5D_f19 y|kY@V-} 4S<6g_2Z96br~y;T'ewnsV>LQ$ŻW ?Z8y:% ˙ vHZ=`޻D$7Kҡj. g+P+.A-#IXUB$ᢞ\gӗ=>pyGu`<ބT:.>)S2_Tg>u Xƞ̭[@D&6瑯H%4ٻ򪺼V}ʤrՅWnx~5_2[ghtg9t}9?Hݔ`;ZJV >n:w/?|~9kg|8}){]/dbGM٣7cwj,#ߜ_bO08;ӧ~~j&z~=:1n>SI.aFew}.vK\;4$,:5 ktp|5^6/_+:$Gzt6_nkY90_jgF=mR<Ǘ3?:v9T'D_ VFzk :T F%٠bW!;lQwь:!~!Շlf@#X&*8&F+ZTRäNiJUp.Gl4„*G׎RΙ Ӝ: R>d (6W?]U?6_0.B6#hӄn¹˨+Lׁ7M.`b;TX/c䌫 L񣯳u]BPX!Xd5j*9f$Q,gF|X : \*"&Qoln2ZeDj+ d 7-!"[A9Կsdef~}^ Q"v&ӋVpa%ڇ+yXgzàԆ4 x8kv?oe#ٴztm'I _Dw5޼|ucƧ(+ 3 س77q^)ay+<t|NjG0ngGo 䃓˫_7dxinAcTTޅB$7TH^y.E$& ?zcuWS U=a3Q5gw|z;UZ!kILXN |;1ySޢkuB~ +D,[IL,"gp^ǃ6)ϘӐHu-SZ [v F^(Q$حu_bЈ@ ҊDU6d)&m4/Vd2^Dz.L;ڈAs o6r5YisRRl>B=Xv\ØJb 7~P"*tncRh)dv"^%HYk=TH!8Q fP64! 
=> yN$^Y\ŧH.$ʴHUd3}ywU7oxl .ޙ`k>aK|KIP9fEKjQ6:DlbZ^ z:SgTD:T=Grc𘨗 ioq7 ,gjg9%d;s'G*Fdb\THL%qʈo:3J(# :-O>Ƞ/fR:ZoJK-u9 {F] C.$o|ە"Bj%jy1xuԈoo47@" D 8Gޥ2GJq ?ΚDF!PϬB@4":V*\̢q>Q\(ӛ{VZrF攀=f L=i o[ 5̊;P9?pTs{d ^T#(+Q1 h(HY$wTNd #-ǭ.NSAĂaozplizJ̾^4=eDjViOENݑ!sM@i\sNPa7ݠkPp"S.SA?wsQZtBg3:W?7eun@{w',\8xw4[M{QJy0 yB[l8O΢xWM=6%3}&M htC,5O BE64\ ƮZ)\WR2Bg NN֑,! -臧OPIH4:΃\ۡ{q}ўM&VW9#|e)l\ub}ɩR5LiGRh89;Aiu JD~SO'+@eRV\/viG;H`LR<hu Zru;zu0J< 7eʚ8_AeT݇"0kk&6v&O0l>Y8@FwضBpTg}>憐e)ggz"*H2[eV(Yv㥾3ʶӃIH( o,Ɗ `KUY3^0ǯWmb2/V  zS2q,W<])ѯ3DHiK4;Œ͔ecEZJ=_fh2*k֧ʚ} *#&iLgSjqh3ΛQD~0 *[{{<7uˇaCwYOȫVmgتAў0sۿGy*^ ZACQ1j??X>-K`??ô>@n0 WqSY:r[n1uN+,J~~$hGg.g f6![@JJJ*v12& 2Ei4$MsUj\C$Z+<3'|>&ooG*)iz;Z26Di79ycLaBP&]IuGf_q@ZPT~&-\yqJfB3xCjUl_~0ERM[BEATA^̏GcK\tRUGnPf|)T x8Nu)UZL,)G4+*Պ#ۀdzayLm(cHLGN2j1aC4:SSf}Ȇc7uɊ!l$;a \洞xi3fmX08}Lf}ͩU P 2x!xƹCy/5b̳[hR'^QX4*8,UJ¦,mdHis!H41p)lbWpO-w6:?@G9;'; @1-c+!R({7/_6}WHJ7␐}%3-[dslULjb$e;ZǏc&H.=!kGsi7ɣg.Gbbe'ǘFiݦPԤǕ|"hߋ"܍&8|-oEQރr>C0KdqH&m^=Աkyux8̾,8vRnh/)oe:݋ݖ%` m\weI .y‘7puԫr\9 Q)Cj%;o9<\p]jڙx0*ȿ ifbn]>ODHTo+3 gܙNsX{LvcW$37gmUZS:^^1E 扬3qt& k3f5˴r*5ұFa9UxTpQݪ }Ĥ»կݨŔϚ91g:fjfgcU!}s㍖/y3'AS-x93c&) ƙ0IYux6G>o=rwIsAϗ O2z3A9c+mD(錊'u_gqB"Wލz9~"i̯П) ӮA5ƴ뤏AF[}7hssp"SU xƽr#>xsO 1)8#OJ<*?]. as[dO&SI)%Ԓ[uXC= ƃxfZ+1Ỵ9¥Ds(I$KHIbRSTQ.C%OIhM KSvJŋdҚ]F0^ҎJvUuU]Fsbm-q1F̉t hZ18hdk}P.m~s rp$KrUӀt3kig:&m٤ vYђr"|?"ǹ x=b-DKr \Bt4Ku 嫍 J9j>5siqkmP~r n'|8߿]T!o_>0ˇ܏?7=|f˚kt6_0dKw&f6U->-11!c inE}[%Urj `{ kY-mC#ob,0';6#Y3M6ߑ`v#xOk$#;Aw,aYB2цŠ4.JIiʩJߣ &Zf2ȏd4werKf.f?kr)og /FXC>b]-D".e9ȞcxČ"31 @T! 
ggnD,WfPb?ThM ޢxK,U ?: { gA qڐY_~$l k*@+* qV(Y1,J%ʢЬEn7:p$AsmFJ1{iqT=TmZܺ.xk2cesPmxU|ǟ1^K^0,OOJ]; } |.'MήM})E-SԼ"JI,1(NJw amssx`8baŴWE,}!lQ׻;27K`QE9MVkNCq|Ua<G넟\tU_%[׵}NO^@k>(54Lʫn &gT. 7-EnL-KCX0H*֠c\rJk H20eVa:eɖ<6VO|Dzۤ,/tE_]0%{D^ǁJ ^W)|@e܇ D)㗕¢t,&g=3ò8ܺ{0bGSiuIYu 2C\i)1JlH2AʽM AVRj'ެZA R]2j: {!$&9  e PF>buX,bɕUe=Z oU{v H?HZJwʈt&w_/5j?N<:z߁Q@R^ů %jo<9#8\A_A3bAz; nR)gC08}+цxhZcAɨJPy'@dzM)J1X3VH_I[R1#eT!jȴ𥉉 -l/>D.kN7 -u<|Z,_Uq͛Άy=V|'vPϴBΘi%d$yϟZr3X[ڟujqg8K"xeG™ &ߡS(N1mMDڤJd=ܔ*#f|<= dx /k섗ӣs0=%2$hbo7rѹK+RJ&y K;W풔${  (` r\M >햢z}CzÅ]7WSi-_=f:Qm%ِfNĿap~943AxTt9jVVs3K^9"i*<9ؔC|wn119-uzw7q.n:}G-$[R͎P;:v:G~(1`UY{kW!qI(<'=lNॊJ %;e^S՘AV0 ߅^.;/F<"N$;D&Q _M}NHsnj{Y'ޟTZ|OûsR S!̈́gOwk?]˓y9yE?Hu]ف뮐 1Zz5o|xwO5͛kIk_(g;`$hHx͓'yr,g5+hjQ5Ўb=9.KO7nUVsHl?v4?o47jG=5= /g@1c$kLnMV-~ybDl1.|M.p6a 71jW"+yu4Zxнp֛ۜ ͏mY_ |yD9Jː?aaCD/8mg]鍹T&`Ew끂vxGfomE0W(B̕6;STJ&ЫnJj^D4D T"9%Zp(=x ѢF +L"I/ !3ȕrA(W\!3](I:c`TZQ01F'B(`!k Gw5|x,IM{i^`j##Sf ŵsF (^9鳏YxUR!#hљ-Jn\r^U;G-%b>*ctdrޓb&^1XTr@Dxd$ 2RʨZG/u#Nrk4t,Ȼ~P+N4eNW %d2JH*0YOGp&@dyP<2dpS$cKT\9и2 =[o=IQq4T QT !FJJB)F8]TΘR: %!dt @mf®@dqܰ/pݤ-L…SܑhYBI%K#u(r#/m[Mj%Zڻf5ˤމǂkj Tvդbӓ7mr'Rإ;0oZ=[UKybNt]F,gkc`5|0M,JJRaVaIY16!aҤ/?9v4"cOrB |$hS0 wUZk)Ǯ[W$ߴK\蒿b"'rŨUPWCy{1l2:P0ƚ(B'Kr恞(\K:Z4P'zc!j!hxM|GwAPj[1hpF)$lJɀEZu}F|6[dF|n7"6&QL9{1 ¦C?h#Ncӂ8kM7@(.gNpE%]&(%YŀAI: j392Ѩ`9@nGC'a;|@Eka$pALR}$:8rfo{'$E.x `:ux-NpG.`'Gս˓5A` IyR⤖xkon nl =Ԋi9mnu`:^\:FҚ!ɅSmd}v[~{ۦ݋[Z-}MCvY;Ameo=#QkG5w/7V_yWos?h.?S>)xsI_JdxpXCU'}ft=.%ۋr^xOq׍ꂸW<[]}70K}Cf+!CGcM3FP1T,BTulu綈%< ^g;߶ۛ)ɔmXPgsj{33:L5,#+^ٻ%>?2·8$ZxJ':*=UQ/Lg^[֍^Gl9_~†:b/shYDA%渗Q 2M5<]G;5n^UG x 2NV UZ#w{N&/rCr#;0Gx+HnIK\@_1"} vFw !,:*ʮչ\MB8~hA'ˋ~d!{&O+iwOL,( l\(zm"y`4!m苏D_'?^/C_E3 p 4|[x_wWr\/_(f"u)A#흜L'4.$ZT^{,R H*b 1 p?SӼ:BZS!|}{r5}BB'4Z_rjUqOzXm#vtֆcc>wuA}igp[V1m>krm;xZ>^N:SYUҳJs[.(b񝉻$cVwmă5WGŪJjy+ۭ{MoǾ؋ZoU”n Ϡ(kce.E €k1_eҧ뱓!BP>֧UJ{mR㒹RV] kc 1Rzz(:,\>R@?|jejV`th ǣqҚv^uW|c]:^ٲ@1#pg:}zR-i)#O_Ӈ~y<.I'> (a Wx_Z_}w[w=ົ]rHu)Y*.#Aa~T/hKgE-l9NEQxD%g9<{ ¬_ 'G&BLr tLv~uw,~ [`(۹sy}׉)rU:/IM_^- tģiE$%|4U2+,GYGQ$S>l"H+;EV_1MÎ \󚲫`4>ߤ}S(I%4 * 
!j8$>F.:$"AR9C'S .a烟G FrQ йfX~pbHqBKt(Ps9K\a2r Z!9G]NW;:zڕͯy  R\=ީf3)\gllO բ6-GhwepŨCDWӷg˳Y< .:‘sǎg|P<9OIeI;21_\^zs* 3GF *Iqe3[<|#R/? Nq%𘰻_ba?Vu@(^LzJQɷy(2tU"Dj,|K3_ހY 3>H\ʘo p8|P丁 N E9z ̜Q,JEQlEcA!;3SuTYJ) J/쁦uͅo>K{;U&M֚Q4p,h2xyNs=n#&a6M}\͒=y,93>0([XIubkIpnEH 1-B .Ft5g1B ' XNI`9BQ/O\i<қC14Jcì:kNnK3&8sS&2EC`SfAk}c_+8cb1l4)$g {YXT` F3hn$]0)9u&[,E99@>P9 aNm<}͡Y;nLJULj!_KiYݽKH QhV =E(:^gM~86Ow; 9~Z0(Ԡ Q%>(t#Pjg;caGAd\2;+|0*3 =q ,}zA]|1䍬=2ЊDm˷iin +#hHdL T^qíX>9}8MY!#IQy{!G>t 'hPXLŹ'=_BǢK,$F2b$Ԋ =BTjD|߽߾AϙgI3tH]T灌O8 h ?6UŊ}/<,883`AؤKtCuغ#G_vgeՅUE  I^6mjI;˒d[vl:Jԙt ,c]eGqQqsdʎTʉ6.cdDS'g%1)WWVN "]OKTL2'WP-+8;}YԤtx^|&LRԁ. Z77m)GQ0>b@7#Ddh6!HAws'N͉:u\M0ȑs"XcL+=zt_ttN8,Q1S*$8b3>?H7KȞ҃TDENLZv60,!/Ց(٬?d-;N>˱&u*sXxAle-S|q[BMS9?˟",F& \r3-d 1Zū~"dׇCx9W4[/9J-(U]5w¤ٴN=ݫ}ѶrhocHNg7]%K$yKsڢomqͩryc.xƹe2`? Y$Z\u7 T:Tff6v6ՠdڡ%% _If.}Zb1BJb^X8@Ő0UWeb-0nh<ILQp\bߦ/Y_D+GKǤէ & }I6ٳꀵ[w@!J2YtZjCZ`VqrTd G틋Cl3e?U$-' ]dΕ}ɭwVBI8"M>76oHbݹL]wTmYv>4v34yĤ,ڝ1Q>O/gw>\Lg>/:B 1lAXd]ZUc4pيO<{&v^7(7EzFy?-7v/:R0flC7a> SJhk9K@] 3@v;2B ex1ɡi=c&evEjc)وKvng=[)Yw$a73yJ LLzVUKŐ-bQS~t畺{Ծ8TZ_gbm3t6?63KIk+ J FJr> 3_󇮆ـ & ļf*-PF?{?>rX8 𻡴NѵI:)]*Zv[A#ky<7Y) khRے X TJM"̞q~Ԩ%:UrpS4XXIEFrRb".@Y(5}FYq^ZpUy$]rgNO' yѼ,ON+&dEU ,l KZ!a$됽~"#e.+peE׏88ž-itltzYS9v\Xk(.+5b[RQ;H~Qk/7"O;gMm"2o_mu$ȅ ѯ~ښzTUi?v~Ty/| u8;HȶX!eڤmksL߮k Cx#I7r%<ϝaR^]M HV N; 5ܲ$İ7L5HN݇yHt\g"ƬE _^U @o: eQI@}vmVbX]>>g:TM#jkvOŀ*eΙGO=/NRAppV.mP,ۍ3-Z-LTV%Vqt[U?{ʫg#b9-2W_c;"h8`񔡦FVaTGlif ^=}Op+72Q](si:Xt<_[)5 ,A6StA5p#&/c"{Z" \esdF]둡Ⱦ]V#[]LT@Fw<ϝ* aZmv7'˫Y*}eS_o5xRga"[q`Nۚ&t(h>;;{}F4l J-Ǣ|u2k<跸kSG}{7%N%w}tw?|_'a>Mڢ%<ϛ0ǽ: nMml0m{Ύ׃jn-87cyXӫXwbku}I3:8YǴ~3z` eXf<ӳ \Ͳ̷aidH4 Z˃zYhl`f^7߽~{GhZQbkEY./..'֖}o 3OMjtH3-熵 sf]^'$?LOW,At}<ġ䏠WNz쟽^;Y)YE/JzXxP9mx'zsMThbHs=TxU5QE#+JT6E=1=hxaCBNi`9u+BjByKo}:- xێVV>.~g |C|lGj=ݯb5*VJ??-nғ@1 Ef GɳOYi&~!Tq(~ZJaOu-q%~'xn'ߝ_'yB*odm5g|!]s%պM&-ʾ LLzgp沛7J[t k@]x _ ƲS;Ud<ں\V3ީ,,_%M}?~̩8ɶpr5y 6l~'y(![6t 5k4H 4rx;^mo?2/+h'qGms}^_=WAN @0!um1[ }κX-+thKjH#!'!0x .ϵ~`uxOetѝ߼lϜρ  5|DlC[s.mEWpy]B&lZ%p;GF;/-on{pp+g7w5W"={<(,Pu 
>$y88]Nidzs"qv=םMdv9ԱpF士&*mPɺoloKQ/<6nE=׏c Zs bxȎ! B]_ЭNϱ/wBuzSSI"H0 0²瘟G؎_įO4h9>߮||X9־5V-=x: ^4RvϬG 8usS<U=dx#*!H`evluމ]yUaȶF^>-3X,%%U[-?3]KgH1J3GbϤϔfbaW8vmeZ #ն(GMԔD}WV3{:XȻ/YimQη@O\*k+g/Ŷ$ ^OoEk`gb2ҝ^PZފʾ̼_.KΞ˨zT7DŸ_uȞO y:}#;6(S֥fl<_̶ 0(Oו|gZgS;R"?{ȑkj )WfQMvt'~zg03勇 qll;S}2 Q$:_ћ#GLj޽>37/XKNgҁ b e wFdG('PҶ ہ cD,]Is1]8O:n ;WI|idӗoJ_-`UΜ9d-`~Y0^>DE ߆~ֳZ`7鲲 h(Җ rs}9bQC&<3a,(S-ް[Ck Z`Wk0W&%^d1c7lkⲴ=ηY>nQ]_7kbhO%TeϩQ}tj vRZ E]|!l{95n-bl߹4բ+fޗD"iY6PLrdX6brFxm / 3+F ] l5߯!K>e2^H'TXh svWC[mK-¯j}Tݤ$7 κOHEf߫ l'8=XyYyBblfBtzFijιt 8~L59,ᕍšNEI RV**`d;bBraY'2}",u֓KS*zΈ(r"]j9~:DmKJ~X1rA jxy0 iN 2j4k)G%U"X_ZOL#)Q9,^gdi0}R9-`YPi,ya kPm@Gn:\)bJD.PnuYf"Q'E&Lר $j,60ն~~>ؽ6W-A[ O/A"+&Ucnrdc%8I@$]N/1FH_kIY ZqY'nˢ*ɠDŽeɨdUP8F 2:WE o wFF2y%@ƹr&4($DAa^˅1j޸r!^-^Xr4_2ԹǬ`qFS,vbeܼ̺v/Re%+?IYb1W'6p*IdĹ7pv &m]]P? lfZʫթ(RiiEzZ*x3-[245G"8(tNgW{t$`|2MzY:,ce"T`JN0˰(ZQB6wi 1My2!̔({h)zJg@0r*1aLʷ-TtvݖbT RUParN I\\߸a&㭂%) LřV9ڕ@%IK#&_ii<%RX1mƭeZ*_iIXx%_+@VB+.d9k)ٱwX]C2e_,1gy۔X"kڱW*c\sLBQK7_wb6RoqO$kepP^6M,:djr`#-74ڔ]Zz)iSeH&A$xj@p!@RJ7~:NS ު2b3,t`u٤` 5s952CR,$(kxuGr`@ç͠ o@Uۏ{,{sJ|6P`BF_R6T8G/}6@3 7&]"^c9"$]VbbhψiqF1r5F9vψaxgDY"#댸{>Jez/x,i`bJ2J{(j[ENӜDJ;9?AX&L[QJbAAJ{a . ̞͕;Έ*P]aEK*;n^;c7qoXut7_}OjF Qy#DN3  e .h0NF7)ߟx2^M- /beKxQmZHZ[x=l9tMCڋ7XC0φԋ0ZD<P&vTzJw[+݇oVnt[e+e!Z+[&] e"]- ϵAZZI|4^T<(b3w 3h.o_i]XQA?=I]{\֔Ð +@(HV1k !ږQŎQ .`I\+&V;pu[E6,Fj+j, ͨ e ʀ{|ts(/ !(L/Ƚ0 %Wc%uy-`n7ؖ,V)zPUKm.[l8eN.|V&!G6`+ze8Ւ»+,>+X? 
rᅳ'*eirss&%܈\ G!T #j"ǘK h٥c2r"Jt؜ @Y,BנZIqIu"敕qtJNrY:C0,=xɸS-h W5` * xv~xMg::.lܯtx,:F$L';EJ,golNMjс]Ha|[*F4(?f S\Ib1WdҔ34u9"[)z5aN(p-y-pTSNi*ʻ"xo!:nZ;/=/պ5My/$,+ O))OEvј_FF?R/ܰ<n4vUL0wFt+4ǃ(G:^Ia]!Q[ 9idP_Ym~vNN;#_9S>ud=ѽnNhR>@5:G1=ɗu[T0H[cB)B5Jff~ c7ck {<0@un,EnҴO2ւlĞWC4ϫblu1u~=z&,fe_Sc[L|NY2 pV44o/70]gin }<yl(G8^Sr &S,D۰T2&C̻aL J THl9D:dЕcu&d $AwC*QQӻBFە ?y!ژʍmQ4_?$O@zDIAPu@n\BL|MJMTJ~PrRGe5knp$x=^$띻%RIyl'O[۴J؍ t҆~]ʦwsj(35S }۸3=C}NqoR1)Җ }і OIEbmZѴ|GqhٛN;P>醳}HmM1'u`+YCK }y`jjS\vcᝋu,t4u}Tq:ʨFw:t4j9/__TR)J+b^).&Qpx97QCLSQN[~<(-?߸ Esk$\hcୡN+/3^9Ǹ@jJ#s)3o|9^hQ_@hOpիCo׻~]fZk;TvefG0:`ārh!iҁx!oEo[`$u뒮' Lڥ 4 OmY&zR3,l#x *-d9c%C1\p&F-+rg48[̂ 39}no޾y?8;uWw3:Su!=ړ)zr]}beYи!O?q8/ҪDqijS%s_D[FLJyJC{7;X9JȵtF9Q;@x$qc@%{X"c ׶^baEA&_R NՒuѭ;VC-ԧE'.&w&Pb=:e,hGT): +laG>O3wgaXvQ(&0#R-r/˔=i_;&7!Ph  :itNHҮkrŽo3[e`+ʣ>Eu}Ѿ#;6 2]ʖ=e K y)sJg[f6BVKLH(p,Kg}lRӵ>bK ΎN[v4I4NiDmi|.٘_6(>ՑO *Gs ! ୡR@qWHؘVKR7]=Ewgq$=DYhI%HKJMvkΚ3l43:M/s+r(X/-AAŞgf^WNړCv^=ޟvkVЩ#XPi#[[3WE5FQK"Aj8(@krB{N"^Trڿϗeˏ&>f cΎvw15陓qsuM֦\Ta+v4dٙ:~%FfJx)tG Q{+'4XtnJÊ, LIZ[R?_(xZw+q-%,[ݬ5֫ Ȭu TQI{(^v=8kc{l1nʒ#ע/W:ѺC+ww%`$o fkw|q=ܤ׽[i׺w Kl o^fn/lvr 0ɜ!T6U3ras7Pz+櫏.k`xHʿcP)'2Z?깺]X Ct"tr>ƚCT^R#!O<*QwKΡ&!_DRj\9RQ&ѠB9x8Y' ae :jwǤ9K/~5y楕o&9}Kj>pN F`JJ9mͳA_ ck+:R2z$g$Z*{Z<s9U/~[9le-U}\U"7wsrr~0}8auO#f/VPkc? V ر/؏P5bx&۴Wkhs<V{(<>5ƤOÑ41n8$`_H)=>7 ޠ-$'FZ*ؓ]~>ii~x:K!c*9J, h5 i ;DC׬bzpf9Ǭ*8O';1))ѳ&:A݂Yso +U{LO1Cޫ*Rf[Qko_wt7 wwnƫ;ҋ3_\p.ډ.J10_5=Q kuR|>(M%C2ϮyQRf6_p`8jKD+%F o6.PZ 4Oj$ /|,yHtbцw|F3hُza Qh+{&ݹ3Av]'p19`g7`ȐCyƫ5 Ҡ^pm'>nOv=]kk_;M I*QP. 9Z-VJ)%fRF\)2i-1ZC TO]yK鯭$~ϗf}~W8WWc&Lx'SCU V5tZi[ .'F9^1Y1YxeV*)>[/?*6?"~;s^ƗiHj t ͤXolC^mg2N_([O_N{rN - @AUR2"? LJlKI)f{:qYobփd[/qU2B4>If WK ~dtbrGS.|U%OyY0hU#]՛hZGl 撕*A = :#x1f!@ ZN&IG HxkrѰcc=c ;!1 ɒ=(eRI)פ(&a\b@ݣaۆ?f{Q$1#KH((3ڑmGD6Z%`1]#Je;@X=M>RDZSYg)"mKk"ӑiz~ 5/vR{ w( ;d!U+m!9HDe! 
(o7g lTv%T1`Дlm@CbEȔJ0r Bt>P;׾·|M|DĴXfRQُQ6a4)Ps+zLD3h[Q%'TsIuk3)=͞)Z,/JXi6IB#R Rғg+ͪY&?N@;!1s7T!]J5?4[7酢X2Xš [:޵=VRZȞH"DjVgBXLII"3լjV][6D =Ǫ]A'rT XaNNIH,rEn DFpɳfU^pہl7 5&9 Va[[DaRCȐK ̀GX 9?d۞b鹟t650͓ݶȹ(,mQМ?ww+Ul}OwݷwQgO+|SǨG_Tմ+,ǿx2JyguөDZO.BWX;쮵cK~[?3(dG\^Z9 $ e5vc9w#E$)cq,?oYu'В9±qAԞgFPDbR"zlX[U4$Yig@(ֳB(XT΢wGu>jϑ=:hKvߣv>kVǠ ֖8)l-߃ uqTgגC`QcrY3dXPR˳f qM+}&S(YQ #`K %KF21P Ijv ܞ:S0;^* 98SFb'^qL*j@xQM5ܰ?cx-4Y٘ t%~"5-e7fvNI#y/!^ Z6z)E$HvuvJ۟FC)u}SOL.LԊf7 d XZDr6H(o# Vlk[Pm؀L_xlA)M # 1.Y#1R|x!cظTtvn P@ "Z(<*{vꎐG#[ R|T >Ҁw%Epw ^}_-Ct .J_w&|6& yzs>[TLh1u'ݿ:>}e.snsࣣ0 Lwx+0ϛd14`RQu젣RcxTΛy$#28-G"45;("y.+ gXW̍mcrvUJw>b_ѝCDɧ?2:8>!}Mo8Ll׌3!·b󠞐ů)+')w$W#n1$@7wyE%h̑N{G,ImH\nR =G1+;1ys~6wOm MXO=nχhLv\a̵2@㖹O 9PB"չLMuJvd¦!7MmDyGہ]ȝu@v3;߻fO ۖꦇvj8H[Xߩx|jx޵8N h ,@; `$ui8ǣ6طۇFȝ VX6DQKi3~YGN̾VGT7﷥Z>^^eRWzu˿RS^D<:MיzGCM^\GL}]x:ϋՏZ,=:l\-& ˫zʸR>ʷ:ohN{ϮO듲F|NSZgF[7%dom Zw nNͺMyons4~CFhm Zw n"sںMhuc·1ENc^Sw [ZpP6ְȝWׁBf5k7qk*鱖J*ҤT-!HR;aYRg7q#nk=a 꼶SEdmoaS QY8lXy+NxZ 4 QZQeQsy9ЉEy} f~ :@X, M]MؑGL5ti}Am:T,/K~w酎S cČ[;~;`MvC .ZS)\+kr#_FZvK6XAw4;Y[i-ȝ6 i6T:i, jr޲Kxr4h)E 1NKQ0@V"?XzC^:Z!,B %13cȌ Yތ]Ԕ6Qv .>L@m8Ԇ2:r9TYY!`HQ> ~ Tx+dcaoc(2y HJ)I(>O5Z^^0x'SP,>yE'UfdtD2GƣBʑ#]I(EM҂h7{2tOќtZcmHr%Xv™Dd-;`0٨bdG< o/i숏י5Lˎ#ö6)죛Ý jh+jj'0Ͷ="G^Ix!Q1` O?k<)yeؔOGME4S,XQk\-,bĦp`j $Ų"EuΆȞ #H9]:6 Wdq]T܏Tmp*|p ,fjEl@ c`tBb.DʘB"x8P 2NF:#- 0ei1tdx8k^ -K*9:r&7hِNq ڑb*;x~&a0R(섘2XjVnn={/-^F3N]<\2P^]tyňjK T#ɐZ )e J;ޕhg@TvNF*5qs؁^t׿e~ #mjiaѢZvXmb>==Fދ54FAִ;C@7"I܌;;Z #7h}P>u-C-^ -LSEԉQ MsJ.e>2d6-l<4v$VAG> rx9A;A}7FkP;1ŕA?}@؁/'g? Դe`bc.Δua;kLs D\FrE s/CfrѩJDf[oiyR87cWD ])u*$m!mhaْ'XQJقE"lhoia[;7?} % s'\a!S1IEɐ0!Sdž6u5v-qH9@[. K͈SMq+ 6kcV1r hQi6$AE[|6?Q)DAPR.us;d !Y20KAchɃD|%V6^z*LݐuejB #Og:YLt:jUkT&UEmlф&j(F*64 QF,gT3RZ{'Fgi>by^6$A! 
var/home/core/zuul-output/logs/kubelet.log
Mar 20 16:01:18 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 16:01:18 crc restorecon[4674]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:01:18 crc restorecon[4674]:
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:01:18 crc restorecon[4674]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:01:18 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 
crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 
crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc 
restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 16:01:19 crc restorecon[4674]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:01:19 crc restorecon[4674]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 16:01:20 crc kubenswrapper[4675]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 16:01:20 crc kubenswrapper[4675]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 16:01:20 crc kubenswrapper[4675]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 16:01:20 crc kubenswrapper[4675]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 16:01:20 crc kubenswrapper[4675]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 16:01:20 crc kubenswrapper[4675]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.416892 4675 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.428691 4675 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.429517 4675 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.429595 4675 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.429647 4675 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.429700 4675 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.429785 4675 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431022 4675 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431142 4675 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431153 4675 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431163 4675 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431174 4675 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431181 4675 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431188 4675 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431196 4675 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431203 4675 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431209 4675 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431214 4675 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431218 4675 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431232 4675 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431237 4675 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431242 4675 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431248 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431253 4675 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431258 4675 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431262 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431266 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431271 4675 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 
16:01:20.431275 4675 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431279 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431288 4675 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431296 4675 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431301 4675 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431305 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431309 4675 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431313 4675 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431317 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431321 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431325 4675 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431330 4675 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431335 4675 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431340 4675 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431345 4675 feature_gate.go:330] unrecognized 
feature gate: NodeDisruptionPolicy Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431350 4675 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431361 4675 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431367 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431373 4675 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431380 4675 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431385 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431391 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431396 4675 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431402 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431407 4675 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431412 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431417 4675 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431423 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431428 4675 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 16:01:20 crc 
kubenswrapper[4675]: W0320 16:01:20.431437 4675 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431442 4675 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431447 4675 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431452 4675 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431458 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431462 4675 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431468 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431570 4675 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431592 4675 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431600 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431605 4675 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431610 4675 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431614 4675 feature_gate.go:330] unrecognized feature gate: Example Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431619 4675 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.431624 4675 feature_gate.go:330] unrecognized feature gate: 
DNSNameResolver Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432171 4675 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432210 4675 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432222 4675 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432228 4675 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432235 4675 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432240 4675 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432247 4675 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432253 4675 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432257 4675 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432262 4675 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432268 4675 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432273 4675 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432278 4675 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432282 4675 flags.go:64] FLAG: --cgroup-root="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432287 4675 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432291 4675 flags.go:64] FLAG: --client-ca-file="" Mar 20 16:01:20 crc 
kubenswrapper[4675]: I0320 16:01:20.432296 4675 flags.go:64] FLAG: --cloud-config="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432301 4675 flags.go:64] FLAG: --cloud-provider="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432319 4675 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432327 4675 flags.go:64] FLAG: --cluster-domain="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432333 4675 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432340 4675 flags.go:64] FLAG: --config-dir="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432346 4675 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432352 4675 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432360 4675 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432365 4675 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432371 4675 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432378 4675 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432384 4675 flags.go:64] FLAG: --contention-profiling="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432390 4675 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432396 4675 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432402 4675 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432408 4675 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 
16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432416 4675 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432422 4675 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432429 4675 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432435 4675 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432441 4675 flags.go:64] FLAG: --enable-server="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432446 4675 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432456 4675 flags.go:64] FLAG: --event-burst="100" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432462 4675 flags.go:64] FLAG: --event-qps="50" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432468 4675 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432474 4675 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432479 4675 flags.go:64] FLAG: --eviction-hard="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432486 4675 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432492 4675 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432497 4675 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432502 4675 flags.go:64] FLAG: --eviction-soft="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432508 4675 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432513 4675 flags.go:64] FLAG: 
--exit-on-lock-contention="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432518 4675 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432524 4675 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432529 4675 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432534 4675 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432540 4675 flags.go:64] FLAG: --feature-gates="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432546 4675 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432552 4675 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432558 4675 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432563 4675 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432568 4675 flags.go:64] FLAG: --healthz-port="10248" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432575 4675 flags.go:64] FLAG: --help="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432581 4675 flags.go:64] FLAG: --hostname-override="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432586 4675 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432591 4675 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432596 4675 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432601 4675 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432606 4675 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432612 4675 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432617 4675 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432622 4675 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432627 4675 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432632 4675 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432638 4675 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432643 4675 flags.go:64] FLAG: --kube-reserved="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432647 4675 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432652 4675 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432658 4675 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432663 4675 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432669 4675 flags.go:64] FLAG: --lock-file="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432697 4675 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432702 4675 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432708 4675 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432718 4675 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432724 4675 flags.go:64] FLAG: 
--log-text-info-buffer-size="0"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432729 4675 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432736 4675 flags.go:64] FLAG: --logging-format="text"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432742 4675 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432748 4675 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432753 4675 flags.go:64] FLAG: --manifest-url=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432800 4675 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432809 4675 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432814 4675 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432820 4675 flags.go:64] FLAG: --max-pods="110"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432824 4675 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432829 4675 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432834 4675 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432839 4675 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432844 4675 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432849 4675 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432854 4675 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432876 4675 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432880 4675 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432885 4675 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432890 4675 flags.go:64] FLAG: --pod-cidr=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432895 4675 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432903 4675 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432907 4675 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432912 4675 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432917 4675 flags.go:64] FLAG: --port="10250"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432921 4675 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432926 4675 flags.go:64] FLAG: --provider-id=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432931 4675 flags.go:64] FLAG: --qos-reserved=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432936 4675 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432940 4675 flags.go:64] FLAG: --register-node="true"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432945 4675 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432949 4675 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432957 4675 flags.go:64] FLAG: --registry-burst="10"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432962 4675 flags.go:64] FLAG: --registry-qps="5"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432969 4675 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432974 4675 flags.go:64] FLAG: --reserved-memory=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432982 4675 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432987 4675 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432992 4675 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.432997 4675 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433002 4675 flags.go:64] FLAG: --runonce="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433006 4675 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433011 4675 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433016 4675 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433020 4675 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433024 4675 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433029 4675 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433034 4675 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433039 4675 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433043 4675 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433048 4675 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433052 4675 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433056 4675 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433061 4675 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433066 4675 flags.go:64] FLAG: --system-cgroups=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433070 4675 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433079 4675 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433083 4675 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433088 4675 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433095 4675 flags.go:64] FLAG: --tls-min-version=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433099 4675 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433104 4675 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433108 4675 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433113 4675 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433118 4675 flags.go:64] FLAG: --v="2"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433125 4675 flags.go:64] FLAG: --version="false"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433132 4675 flags.go:64] FLAG: --vmodule=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433138 4675 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433143 4675 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433290 4675 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433296 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433303 4675 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433308 4675 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433313 4675 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433318 4675 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433323 4675 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433329 4675 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433334 4675 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433339 4675 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433343 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433347 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433351 4675 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433356 4675 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433360 4675 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433365 4675 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433369 4675 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433374 4675 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433378 4675 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433383 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433387 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433391 4675 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433396 4675 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433400 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433405 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433408 4675 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433412 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433417 4675 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433420 4675 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433424 4675 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433428 4675 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433432 4675 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433435 4675 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433439 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433443 4675 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433447 4675 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433451 4675 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433455 4675 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433461 4675 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433466 4675 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433472 4675 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433475 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433479 4675 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433483 4675 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433489 4675 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433494 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433498 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433502 4675 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433507 4675 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433510 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433515 4675 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433518 4675 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433523 4675 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433529 4675 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433533 4675 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433537 4675 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433542 4675 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433546 4675 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433550 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433553 4675 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433557 4675 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433562 4675 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433567 4675 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433572 4675 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433576 4675 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433581 4675 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433585 4675 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433589 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433594 4675 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433598 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.433602 4675 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.433609 4675 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.442195 4675 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.442240 4675 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442322 4675 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442332 4675 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442337 4675 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442343 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442348 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442353 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442358 4675 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442362 4675 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442368 4675 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442373 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442378 4675 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442384 4675 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442391 4675 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442396 4675 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442400 4675 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442404 4675 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442408 4675 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442413 4675 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442417 4675 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442421 4675 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442425 4675 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442431 4675 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442436 4675 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442441 4675 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442446 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442451 4675 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442455 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442460 4675 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442464 4675 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442469 4675 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442474 4675 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442479 4675 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442483 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442488 4675 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442497 4675 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442501 4675 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442505 4675 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442509 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442512 4675 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442516 4675 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442520 4675 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442525 4675 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442529 4675 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442535 4675 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442544 4675 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442548 4675 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442553 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442559 4675 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442564 4675 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442569 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442573 4675 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442578 4675 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442582 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442586 4675 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442590 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442595 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442600 4675 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442604 4675 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442608 4675 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442612 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442616 4675 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442620 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442625 4675 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442629 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442633 4675 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442637 4675 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442641 4675 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442645 4675 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442649 4675 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442653 4675 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442660 4675 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.442667 4675 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442821 4675 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442829 4675 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442834 4675 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442840 4675 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442847 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442852 4675 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442856 4675 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442861 4675 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442867 4675 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442871 4675 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442876 4675 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442881 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442886 4675 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442891 4675 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442896 4675 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442900 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442904 4675 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442908 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442912 4675 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442917 4675 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442921 4675 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442925 4675 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442929 4675 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442932 4675 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442938 4675 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442944 4675 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442949 4675 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442954 4675 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442959 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442963 4675 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442967 4675 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442971 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442975 4675 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442979 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442988 4675 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442992 4675 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.442996 4675 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443002 4675 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443006 4675 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443010 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443014 4675 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443018 4675 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443021 4675 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443025 4675 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443029 4675 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443033 4675 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443038 4675 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443044 4675 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443048 4675 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443052 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443056 4675 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443060 4675 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443064 4675 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443068 4675 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443071 4675 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443075 4675 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443079 4675 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443083 4675 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443088 4675 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443092 4675 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443096 4675 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 16:01:20 crc 
kubenswrapper[4675]: W0320 16:01:20.443101 4675 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443105 4675 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443109 4675 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443112 4675 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443117 4675 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443122 4675 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443127 4675 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443132 4675 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443136 4675 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.443148 4675 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.443155 4675 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 16:01:20 crc 
kubenswrapper[4675]: I0320 16:01:20.444022 4675 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.449948 4675 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.454122 4675 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.454219 4675 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.455880 4675 server.go:997] "Starting client certificate rotation" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.455915 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.456142 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.480359 4675 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.481398 4675 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.482180 4675 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.497866 4675 log.go:25] "Validated CRI v1 runtime API" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.532759 4675 log.go:25] "Validated CRI v1 image API" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.535272 4675 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.548051 4675 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-15-54-47-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.548089 4675 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.581048 4675 manager.go:217] Machine: {Timestamp:2026-03-20 16:01:20.576978971 +0000 UTC m=+0.610608588 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3fb7a7bb-d55d-430f-9fb9-3c580cf224f3 BootID:b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 
HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4e:66:55 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4e:66:55 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a0:27:b5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2b:9c:18 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d8:aa:00 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c0:ec:23 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8e:78:b2:55:79:6f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7e:67:ec:2f:b5:5c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 
Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 
Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.581433 4675 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.581733 4675 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.582622 4675 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.583019 4675 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.583088 4675 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.583402 4675 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.583420 4675 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.584166 4675 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.584219 4675 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.584435 4675 state_mem.go:36] "Initialized new in-memory state store" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.585053 4675 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.589488 4675 kubelet.go:418] "Attempting to sync node with API server" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.589519 4675 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.589559 4675 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.589585 4675 kubelet.go:324] "Adding apiserver pod source" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.589608 4675 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.593203 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.593300 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.593334 4675 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.593435 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.596593 4675 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.597618 4675 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.600021 4675 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601698 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601722 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601730 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601737 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601748 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601755 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601779 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601828 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601837 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601846 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601872 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.601881 4675 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.604292 4675 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.604835 4675 server.go:1280] "Started kubelet" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.605098 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:20 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.606717 4675 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.607018 4675 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.607848 4675 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.611870 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.611919 4675 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.612533 4675 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.612557 4675 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.612684 4675 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.613335 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.615412 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.615624 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615712 4675 factory.go:153] Registering CRI-O factory Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615729 4675 factory.go:221] Registration of the crio container factory successfully Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615820 4675 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615831 4675 factory.go:55] Registering systemd factory Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615845 4675 factory.go:221] Registration of the systemd container factory successfully Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.615858 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="200ms" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615910 4675 factory.go:103] Registering Raw factory Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.615931 4675 manager.go:1196] Started watching for new ooms in manager Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 
16:01:20.622416 4675 manager.go:319] Starting recovery of all containers Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.623309 4675 server.go:460] "Adding debug handlers to kubelet server" Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.622390 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e980c0401a933 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,LastTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629126 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629267 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629290 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629309 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629327 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629345 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629364 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629381 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629403 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629420 
4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629438 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629459 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629477 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629504 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629557 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629587 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629613 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629637 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629663 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629687 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629712 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629732 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629788 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629808 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629830 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629848 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629869 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629889 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.629909 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632319 4675 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632373 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632401 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632422 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632445 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632466 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632484 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632503 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632523 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632542 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632562 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632680 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632705 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632727 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632746 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632792 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632812 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632831 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632856 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632875 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632896 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632932 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632952 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.632989 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633015 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633036 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633056 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633079 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633288 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633308 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633327 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633346 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633364 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633384 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633405 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633424 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633443 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633501 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633530 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633555 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633581 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633605 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633631 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633655 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633677 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633696 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633715 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633735 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633781 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633799 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633825 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633844 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633862 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633880 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633900 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633920 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633939 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633959 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633980 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.633998 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634046 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634067 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634272 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634306 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634325 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634344 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634364 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634382 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634403 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634425 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634444 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634462 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634482 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634501 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634519 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634538 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634571 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634591 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634613 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634633 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634653 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634672 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634694 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634714 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634734 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634754 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634799 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634818 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634836 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.634854 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.637688 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.637847 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.637884 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.637932 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.637962 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638059 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638095 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638131 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638145 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638156 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638170 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638181 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638194 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638205 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638215 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638231 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638241 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638254 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638265 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638276 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d"
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638290 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638720 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638734 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638745 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638760 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638812 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638826 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638836 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638846 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638858 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638926 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.638966 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639001 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639023 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639044 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639072 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639096 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639125 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 
16:01:20.639146 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639171 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639198 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639219 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639246 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639296 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639346 4675 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639367 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639381 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639394 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639412 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639426 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639457 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639470 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639483 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639501 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639515 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639528 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639598 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639677 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639695 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639718 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639755 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639838 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639869 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639884 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639904 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639919 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639936 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639948 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639960 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639978 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.639991 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640003 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640043 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640655 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640690 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" 
seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640702 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640713 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640725 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640738 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640749 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640760 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 16:01:20 crc 
kubenswrapper[4675]: I0320 16:01:20.640795 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640808 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640821 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640833 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640845 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640857 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640870 4675 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640882 4675 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640896 4675 reconstruct.go:97] "Volume reconstruction finished" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.640905 4675 reconciler.go:26] "Reconciler: start to sync state" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.645284 4675 manager.go:324] Recovery completed Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.654192 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.655623 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.655678 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.655689 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.656524 4675 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.656550 4675 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.656573 4675 state_mem.go:36] "Initialized new in-memory state store" Mar 20 16:01:20 crc 
kubenswrapper[4675]: I0320 16:01:20.669694 4675 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.671518 4675 policy_none.go:49] "None policy: Start" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.672049 4675 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.672420 4675 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.672471 4675 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.672530 4675 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 16:01:20 crc kubenswrapper[4675]: W0320 16:01:20.673031 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.673089 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.673237 4675 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.673268 4675 state_mem.go:35] "Initializing new in-memory state store" Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.713942 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.738325 4675 manager.go:334] "Starting Device Plugin manager"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.738659 4675 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.738684 4675 server.go:79] "Starting device plugin registration server"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.739197 4675 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.739219 4675 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.739651 4675 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.739740 4675 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.739750 4675 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.753221 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.772690 4675 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.772802 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.774083 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.774121 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.774134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.774260 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.774458 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.774526 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.775126 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.775168 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.775186 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.775417 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.775612 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.775659 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776479 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776528 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776542 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776746 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776821 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.776937 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777096 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777137 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777530 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777784 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777800 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.777917 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778083 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778115 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778143 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778150 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778166 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.778879 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.779557 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.779598 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.779618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.780046 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.780084 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.781341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.781366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.781377 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.817092 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="400ms"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.839593 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.841460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.841558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.841589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.841631 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: E0320 16:01:20.842597 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.845033 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.845310 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.845519 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.845800 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.845930 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846003 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846050 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846137 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846187 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846233 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846417 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846439 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846521 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846580 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.846604 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947662 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947739 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947827 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947871 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947910 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947931 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.947962 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948010 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948098 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948033 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948054 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948287 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948077 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948032 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948078 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948265 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948476 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948528 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948579 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948611 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948640 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948658 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948681 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948717 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948643 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948686 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948672 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948721 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948840 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:20 crc kubenswrapper[4675]: I0320 16:01:20.948970 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.043422 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.045099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.045372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.045540 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.045740 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: E0320 16:01:21.046537 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.117436 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.146484 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.164659 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.174268 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.179978 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.199954 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-33ba7363aad09e8580cf13473a2e24a8771550395c32049cd8924c8498ea289c WatchSource:0}: Error finding container 33ba7363aad09e8580cf13473a2e24a8771550395c32049cd8924c8498ea289c: Status 404 returned error can't find the container with id 33ba7363aad09e8580cf13473a2e24a8771550395c32049cd8924c8498ea289c
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.204007 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-288990e619bfe954fcc09b991433cd35e7d03b9639f2fd83b9a577146f2abd71 WatchSource:0}: Error finding container 288990e619bfe954fcc09b991433cd35e7d03b9639f2fd83b9a577146f2abd71: Status 404 returned error can't find the container with id 288990e619bfe954fcc09b991433cd35e7d03b9639f2fd83b9a577146f2abd71
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.209107 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f914bc17f5780372a876f5ce8b5df59b9284a022e818a7c3b10deade044e048b WatchSource:0}: Error finding container f914bc17f5780372a876f5ce8b5df59b9284a022e818a7c3b10deade044e048b: Status 404 returned error can't find the container with id f914bc17f5780372a876f5ce8b5df59b9284a022e818a7c3b10deade044e048b
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.213787 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-5e47dd131ffb477ff8d9648b6b3c32e4215c31b963ce1d5ecd726a2f1d8e5a84 WatchSource:0}: Error finding container 5e47dd131ffb477ff8d9648b6b3c32e4215c31b963ce1d5ecd726a2f1d8e5a84: Status 404 returned error can't find the container with id 5e47dd131ffb477ff8d9648b6b3c32e4215c31b963ce1d5ecd726a2f1d8e5a84
Mar 20 16:01:21 crc kubenswrapper[4675]: E0320 16:01:21.217942 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="800ms"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.447019 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.448336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.448385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.448398 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.448427 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: E0320 16:01:21.448951 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.558196 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Mar 20 16:01:21 crc kubenswrapper[4675]: E0320 16:01:21.558321 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.607025 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.628010 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Mar 20 16:01:21 crc kubenswrapper[4675]: E0320 16:01:21.628146 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.678669 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5e47dd131ffb477ff8d9648b6b3c32e4215c31b963ce1d5ecd726a2f1d8e5a84"}
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.680534 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f914bc17f5780372a876f5ce8b5df59b9284a022e818a7c3b10deade044e048b"}
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.681601 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"288990e619bfe954fcc09b991433cd35e7d03b9639f2fd83b9a577146f2abd71"}
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.683062 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33ba7363aad09e8580cf13473a2e24a8771550395c32049cd8924c8498ea289c"}
Mar 20 16:01:21 crc kubenswrapper[4675]: I0320 16:01:21.684645 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7234e19f3adb1ceefd7e84dac7efe7105021f29a9fa09eca74d0d18182e417ed"}
Mar 20 16:01:21 crc kubenswrapper[4675]: W0320 16:01:21.921684 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Mar 20 16:01:21 crc kubenswrapper[4675]: E0320 16:01:21.921756 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:01:22 crc kubenswrapper[4675]: E0320 16:01:22.019022 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="1.6s"
Mar 20 16:01:22 crc kubenswrapper[4675]: W0320 16:01:22.198121 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Mar 20 16:01:22 crc kubenswrapper[4675]: E0320 16:01:22.198209 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.249854 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.251828 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.251867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.251876 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.251900 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:01:22 crc kubenswrapper[4675]: E0320 16:01:22.252370 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.605869 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.658082 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 16:01:22 crc kubenswrapper[4675]: E0320 16:01:22.659032 4675 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.691620 4675 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c" exitCode=0
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.691744 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.691744 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c"}
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.692976 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.693040 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20
16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.693058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.693926 4675 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec" exitCode=0 Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.694003 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.694072 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.695266 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.695310 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.695328 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.697258 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.697370 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.697379 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.697395 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.697407 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.698428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.698456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.698467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.699943 4675 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c" exitCode=0 Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.700027 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.700030 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.701197 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.701238 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.701256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.702151 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82" exitCode=0 Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.702193 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82"} Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.702261 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.703117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.703142 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.703151 4675 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.704542 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.705573 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.705599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.705608 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.941382 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.941512 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" start-of-body= Mar 20 16:01:22 crc kubenswrapper[4675]: I0320 16:01:22.941551 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.605845 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.234:6443: connect: connection refused Mar 20 16:01:23 crc kubenswrapper[4675]: E0320 16:01:23.620637 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="3.2s" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.706483 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.706513 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.707559 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.707609 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.707621 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.709490 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.709509 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.709511 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.709560 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.710037 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.710061 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.710069 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.713062 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.713093 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.713102 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093"} 
Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.713113 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.714496 4675 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4" exitCode=0 Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.714587 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715038 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715303 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4"} Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715583 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715902 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715918 4675 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.715927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.853199 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.854321 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.854356 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.854369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:23 crc kubenswrapper[4675]: I0320 16:01:23.854394 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:23 crc kubenswrapper[4675]: E0320 16:01:23.854816 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Mar 20 16:01:23 crc kubenswrapper[4675]: W0320 16:01:23.964257 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:23 crc kubenswrapper[4675]: E0320 16:01:23.964342 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection 
refused" logger="UnhandledError" Mar 20 16:01:24 crc kubenswrapper[4675]: W0320 16:01:24.104520 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:24 crc kubenswrapper[4675]: E0320 16:01:24.104641 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:24 crc kubenswrapper[4675]: W0320 16:01:24.213017 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Mar 20 16:01:24 crc kubenswrapper[4675]: E0320 16:01:24.213095 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.720047 4675 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50" exitCode=0 Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.721970 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50"} Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.722218 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.725136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.725189 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.725210 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.726718 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.727151 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df381a0976417b4496efedeef3c36eb6f409252ce2f53fb7e35d72a28e9caa79"} Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.727229 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.727234 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.727290 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.727994 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 
16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728624 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728614 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728670 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728691 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728629 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728732 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728746 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728857 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728878 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:24 crc kubenswrapper[4675]: I0320 16:01:24.728638 4675 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.733841 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b"} Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.733887 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.733925 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.733952 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.733891 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f"} Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734051 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8"} Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734062 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd"} Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734071 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73"} Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734855 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734877 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734938 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734958 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.734967 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.894984 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.895163 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.896263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.896309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.896321 4675 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:25 crc kubenswrapper[4675]: I0320 16:01:25.905006 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.301189 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.389756 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.737057 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.737089 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.737115 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.740861 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.740919 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.740935 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.741034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.741070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.741082 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.740912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.741149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:26 crc kubenswrapper[4675]: I0320 16:01:26.741164 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.035285 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.055761 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.056856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.056892 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.056901 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.056923 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.276995 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.343702 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.742513 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.742657 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.742666 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745042 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745118 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745185 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745186 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745130 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.745242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:27 crc kubenswrapper[4675]: I0320 16:01:27.918630 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.744584 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.744694 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.745402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.745436 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.745447 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.746465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.746496 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:28 crc kubenswrapper[4675]: I0320 16:01:28.746505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:30 crc kubenswrapper[4675]: I0320 16:01:30.299202 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:30 crc kubenswrapper[4675]: I0320 16:01:30.299370 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:30 crc kubenswrapper[4675]: I0320 16:01:30.300812 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:30 crc kubenswrapper[4675]: I0320 16:01:30.300842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:30 crc kubenswrapper[4675]: I0320 16:01:30.300851 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:30 crc kubenswrapper[4675]: E0320 16:01:30.753350 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 16:01:33 crc kubenswrapper[4675]: I0320 16:01:33.857469 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 16:01:33 crc kubenswrapper[4675]: I0320 16:01:33.857734 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:33 crc kubenswrapper[4675]: I0320 16:01:33.859286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:33 crc kubenswrapper[4675]: I0320 16:01:33.859335 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:33 crc kubenswrapper[4675]: I0320 16:01:33.859353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.607717 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.764136 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.766489 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="df381a0976417b4496efedeef3c36eb6f409252ce2f53fb7e35d72a28e9caa79" exitCode=255
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.766561 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"df381a0976417b4496efedeef3c36eb6f409252ce2f53fb7e35d72a28e9caa79"}
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.766903 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.768470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.768512 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.768525 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.769190 4675 scope.go:117] "RemoveContainer" containerID="df381a0976417b4496efedeef3c36eb6f409252ce2f53fb7e35d72a28e9caa79"
Mar 20 16:01:34 crc kubenswrapper[4675]: W0320 16:01:34.830850 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.830988 4675 trace.go:236] Trace[1186511916]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 16:01:24.829) (total time: 10001ms):
Mar 20 16:01:34 crc kubenswrapper[4675]: Trace[1186511916]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:01:34.830)
Mar 20 16:01:34 crc kubenswrapper[4675]: Trace[1186511916]: [10.001460229s] [10.001460229s] END
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.831025 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 16:01:34 crc kubenswrapper[4675]: W0320 16:01:34.928039 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.928157 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.932147 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 16:01:34 crc kubenswrapper[4675]: W0320 16:01:34.933526 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.933617 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.934328 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.934597 4675 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.934645 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.935409 4675 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.940403 4675 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 16:01:34 crc kubenswrapper[4675]: I0320 16:01:34.940494 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.940927 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e980c0401a933 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,LastTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:01:34 crc kubenswrapper[4675]: W0320 16:01:34.941375 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:34 crc kubenswrapper[4675]: E0320 16:01:34.941486 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.611030 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:35Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.772911 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.773564 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.776200 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee" exitCode=255
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.776235 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee"}
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.776315 4675 scope.go:117] "RemoveContainer" containerID="df381a0976417b4496efedeef3c36eb6f409252ce2f53fb7e35d72a28e9caa79"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.776474 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.777792 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.777820 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.777831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.778367 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee"
Mar 20 16:01:35 crc kubenswrapper[4675]: E0320 16:01:35.778535 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.942539 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 16:01:35 crc kubenswrapper[4675]: I0320 16:01:35.942660 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.301205 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.394515 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.394668 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.396122 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.396189 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.396233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.609190 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:36Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.781500 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.784247 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.785444 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.785478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.785490 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:36 crc kubenswrapper[4675]: I0320 16:01:36.786074 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee"
Mar 20 16:01:36 crc kubenswrapper[4675]: E0320 16:01:36.786283 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.285484 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.306378 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.611044 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:37Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.786611 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.787360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.787392 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.787401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.787920 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee"
Mar 20 16:01:37 crc kubenswrapper[4675]: E0320 16:01:37.788100 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:01:37 crc kubenswrapper[4675]: I0320 16:01:37.793907 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:01:38 crc kubenswrapper[4675]: I0320 16:01:38.608794 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:38Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:38 crc kubenswrapper[4675]: I0320 16:01:38.789287 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:38 crc kubenswrapper[4675]: I0320 16:01:38.790464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:38 crc kubenswrapper[4675]: I0320 16:01:38.790506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:38 crc kubenswrapper[4675]: I0320 16:01:38.790517 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:38 crc kubenswrapper[4675]: I0320 16:01:38.791151 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee"
Mar 20 16:01:38 crc kubenswrapper[4675]: E0320 16:01:38.791378 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:01:39 crc kubenswrapper[4675]: I0320 16:01:39.612225 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:39Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:39 crc kubenswrapper[4675]: I0320 16:01:39.792414 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:39 crc kubenswrapper[4675]: I0320 16:01:39.793491 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:39 crc kubenswrapper[4675]: I0320 16:01:39.793562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:39 crc kubenswrapper[4675]: I0320 16:01:39.793577 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:39 crc kubenswrapper[4675]: I0320 16:01:39.794391 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee"
Mar 20 16:01:39 crc kubenswrapper[4675]: E0320 16:01:39.794610 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:01:40 crc kubenswrapper[4675]: I0320 16:01:40.610580 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:40Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:40 crc kubenswrapper[4675]: E0320 16:01:40.753458 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 16:01:41 crc kubenswrapper[4675]: W0320 16:01:41.041358 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:41 crc kubenswrapper[4675]: E0320 16:01:41.041428 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:41 crc kubenswrapper[4675]: I0320 16:01:41.332311 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:41 crc kubenswrapper[4675]: I0320 16:01:41.333737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:41 crc kubenswrapper[4675]: I0320 16:01:41.333797 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:41 crc kubenswrapper[4675]: I0320 16:01:41.333807 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:41 crc kubenswrapper[4675]: I0320 16:01:41.333834 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:01:41 crc kubenswrapper[4675]: E0320 16:01:41.336575 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 16:01:41 crc kubenswrapper[4675]: E0320 16:01:41.338308 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 16:01:41 crc kubenswrapper[4675]: I0320 16:01:41.609745 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:41 crc kubenswrapper[4675]: W0320 16:01:41.657263 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:41 crc kubenswrapper[4675]: E0320 16:01:41.657415 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:42 crc kubenswrapper[4675]: W0320 16:01:42.373562 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:42Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:42 crc kubenswrapper[4675]: E0320 16:01:42.373743 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:42 crc kubenswrapper[4675]: I0320 16:01:42.609460 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:42Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.611537 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:43Z is after 2026-02-23T05:33:13Z
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.672597 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 16:01:43 crc kubenswrapper[4675]: E0320 16:01:43.677073 4675 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.889613 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.889820 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.890895 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.890940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.890952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:01:43 crc kubenswrapper[4675]: I0320 16:01:43.904955 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-etcd/etcd-crc" Mar 20 16:01:44 crc kubenswrapper[4675]: W0320 16:01:44.302859 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:44Z is after 2026-02-23T05:33:13Z Mar 20 16:01:44 crc kubenswrapper[4675]: E0320 16:01:44.302972 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:44 crc kubenswrapper[4675]: I0320 16:01:44.611387 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:44Z is after 2026-02-23T05:33:13Z Mar 20 16:01:44 crc kubenswrapper[4675]: I0320 16:01:44.804644 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:44 crc kubenswrapper[4675]: I0320 16:01:44.806291 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:44 crc kubenswrapper[4675]: I0320 16:01:44.806338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:44 crc kubenswrapper[4675]: I0320 16:01:44.806350 4675 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:44 crc kubenswrapper[4675]: E0320 16:01:44.945136 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:44Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e980c0401a933 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,LastTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.608480 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 2026-02-23T05:33:13Z Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.942090 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.942182 4675 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.942310 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.942624 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.944046 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.944094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.944105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.944710 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 16:01:45 crc kubenswrapper[4675]: I0320 16:01:45.944925 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
containerID="cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b" gracePeriod=30 Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.608982 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:46Z is after 2026-02-23T05:33:13Z Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.818630 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.819389 4675 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b" exitCode=255 Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.819440 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b"} Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.819467 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704"} Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.819557 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.820591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.820629 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:46 crc kubenswrapper[4675]: I0320 16:01:46.820642 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:47 crc kubenswrapper[4675]: I0320 16:01:47.609282 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:47Z is after 2026-02-23T05:33:13Z Mar 20 16:01:48 crc kubenswrapper[4675]: I0320 16:01:48.337675 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:48 crc kubenswrapper[4675]: I0320 16:01:48.338921 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:48 crc kubenswrapper[4675]: I0320 16:01:48.338954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:48 crc kubenswrapper[4675]: I0320 16:01:48.338965 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:48 crc kubenswrapper[4675]: I0320 16:01:48.338988 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:48 crc kubenswrapper[4675]: E0320 16:01:48.341138 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:48Z is after 2026-02-23T05:33:13Z" 
interval="7s" Mar 20 16:01:48 crc kubenswrapper[4675]: E0320 16:01:48.342572 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:48Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:48 crc kubenswrapper[4675]: I0320 16:01:48.609097 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:48Z is after 2026-02-23T05:33:13Z Mar 20 16:01:49 crc kubenswrapper[4675]: I0320 16:01:49.610837 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:49Z is after 2026-02-23T05:33:13Z Mar 20 16:01:50 crc kubenswrapper[4675]: I0320 16:01:50.299597 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:50 crc kubenswrapper[4675]: I0320 16:01:50.299784 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:50 crc kubenswrapper[4675]: I0320 16:01:50.300828 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:50 crc kubenswrapper[4675]: I0320 16:01:50.300862 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:50 crc kubenswrapper[4675]: I0320 16:01:50.300872 4675 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:50 crc kubenswrapper[4675]: I0320 16:01:50.611560 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:50Z is after 2026-02-23T05:33:13Z Mar 20 16:01:50 crc kubenswrapper[4675]: E0320 16:01:50.753741 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:01:51 crc kubenswrapper[4675]: I0320 16:01:51.608294 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:51Z is after 2026-02-23T05:33:13Z Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.610707 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:52Z is after 2026-02-23T05:33:13Z Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.673272 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.674906 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.674961 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 
16:01:52.674972 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.675538 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.942008 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.942268 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.944232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.944276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:52 crc kubenswrapper[4675]: I0320 16:01:52.944286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:53 crc kubenswrapper[4675]: W0320 16:01:53.218356 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:53Z is after 2026-02-23T05:33:13Z Mar 20 16:01:53 crc kubenswrapper[4675]: E0320 16:01:53.218461 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T16:01:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.611001 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:53Z is after 2026-02-23T05:33:13Z Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.840541 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.842194 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706"} Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.842362 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.843213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.843248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:53 crc kubenswrapper[4675]: I0320 16:01:53.843258 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.608756 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:54Z is after 2026-02-23T05:33:13Z Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.846781 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.847435 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.849720 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706" exitCode=255 Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.849784 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706"} Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.849831 4675 scope.go:117] "RemoveContainer" containerID="dca8adb9b40ea41dcefff58ff5d6b96e3fc2c2306d21cdb40dd7f2f002d5b5ee" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.849984 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.850882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.850922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.850932 4675 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:54 crc kubenswrapper[4675]: I0320 16:01:54.851569 4675 scope.go:117] "RemoveContainer" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706" Mar 20 16:01:54 crc kubenswrapper[4675]: E0320 16:01:54.851746 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:54 crc kubenswrapper[4675]: E0320 16:01:54.948888 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:54Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e980c0401a933 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,LastTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.342645 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.344547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.344587 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.344601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.344629 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:55 crc kubenswrapper[4675]: E0320 16:01:55.346485 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:55Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 16:01:55 crc kubenswrapper[4675]: E0320 16:01:55.349667 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.611716 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:55Z is after 2026-02-23T05:33:13Z Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.854153 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.942812 4675 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:01:55 crc kubenswrapper[4675]: I0320 16:01:55.942911 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.301889 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.302122 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.303461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.303502 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.303513 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.304054 4675 scope.go:117] "RemoveContainer" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706" Mar 20 16:01:56 crc kubenswrapper[4675]: E0320 16:01:56.304217 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:56 crc kubenswrapper[4675]: I0320 16:01:56.610188 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:56Z is after 2026-02-23T05:33:13Z Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.306872 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.307071 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.309688 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.309713 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.309722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.310198 4675 scope.go:117] "RemoveContainer" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706" Mar 20 16:01:57 crc kubenswrapper[4675]: E0320 16:01:57.310353 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:57 crc kubenswrapper[4675]: I0320 16:01:57.609147 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:57Z is after 2026-02-23T05:33:13Z Mar 20 16:01:58 crc kubenswrapper[4675]: W0320 16:01:58.131447 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:58Z is after 2026-02-23T05:33:13Z Mar 20 16:01:58 crc kubenswrapper[4675]: E0320 16:01:58.131552 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:58 crc kubenswrapper[4675]: I0320 16:01:58.609629 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:58Z is after 2026-02-23T05:33:13Z Mar 20 16:01:58 crc kubenswrapper[4675]: 
W0320 16:01:58.784590 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:58Z is after 2026-02-23T05:33:13Z Mar 20 16:01:58 crc kubenswrapper[4675]: E0320 16:01:58.784670 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:59 crc kubenswrapper[4675]: I0320 16:01:59.609987 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:59Z is after 2026-02-23T05:33:13Z Mar 20 16:02:00 crc kubenswrapper[4675]: I0320 16:02:00.608606 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:00Z is after 2026-02-23T05:33:13Z Mar 20 16:02:00 crc kubenswrapper[4675]: E0320 16:02:00.753991 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:02:01 crc kubenswrapper[4675]: I0320 16:02:01.198870 4675 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:02:01 crc kubenswrapper[4675]: E0320 16:02:01.204956 4675 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:02:01 crc kubenswrapper[4675]: E0320 16:02:01.206214 4675 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 20 16:02:01 crc kubenswrapper[4675]: I0320 16:02:01.610972 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:01Z is after 2026-02-23T05:33:13Z Mar 20 16:02:02 crc kubenswrapper[4675]: E0320 16:02:02.349712 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:02Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 16:02:02 crc kubenswrapper[4675]: I0320 16:02:02.349791 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:02:02 crc kubenswrapper[4675]: I0320 16:02:02.350927 4675 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:02 crc kubenswrapper[4675]: I0320 16:02:02.350962 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:02 crc kubenswrapper[4675]: I0320 16:02:02.350973 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:02 crc kubenswrapper[4675]: I0320 16:02:02.351005 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:02:02 crc kubenswrapper[4675]: E0320 16:02:02.354512 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:02Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:02:02 crc kubenswrapper[4675]: I0320 16:02:02.609194 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:02Z is after 2026-02-23T05:33:13Z Mar 20 16:02:03 crc kubenswrapper[4675]: I0320 16:02:03.608361 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:03Z is after 2026-02-23T05:33:13Z Mar 20 16:02:04 crc kubenswrapper[4675]: I0320 16:02:04.609510 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-20T16:02:04Z is after 2026-02-23T05:33:13Z Mar 20 16:02:04 crc kubenswrapper[4675]: E0320 16:02:04.955988 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:04Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e980c0401a933 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,LastTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:05 crc kubenswrapper[4675]: I0320 16:02:05.609519 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:05Z is after 2026-02-23T05:33:13Z Mar 20 16:02:05 crc kubenswrapper[4675]: I0320 16:02:05.942484 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:02:05 crc kubenswrapper[4675]: I0320 16:02:05.942570 4675 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:02:06 crc kubenswrapper[4675]: I0320 16:02:06.610178 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:06 crc kubenswrapper[4675]: W0320 16:02:06.775228 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:06 crc kubenswrapper[4675]: E0320 16:02:06.775293 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 16:02:07 crc kubenswrapper[4675]: I0320 16:02:07.610277 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:08 crc kubenswrapper[4675]: I0320 16:02:08.609528 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 
16:02:09.354826 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.356164 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.356248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.356294 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.356344 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:02:09 crc kubenswrapper[4675]: E0320 16:02:09.358236 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 16:02:09 crc kubenswrapper[4675]: E0320 16:02:09.358268 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.612306 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.673470 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.674708 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.674758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.674796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4675]: I0320 16:02:09.675438 4675 scope.go:117] "RemoveContainer" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706" Mar 20 16:02:09 crc kubenswrapper[4675]: E0320 16:02:09.675633 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:02:10 crc kubenswrapper[4675]: W0320 16:02:10.186917 4675 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 16:02:10 crc kubenswrapper[4675]: E0320 16:02:10.186996 4675 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 16:02:10 crc kubenswrapper[4675]: I0320 16:02:10.612476 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 16:02:10 crc kubenswrapper[4675]: E0320 16:02:10.754280 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:02:11 crc kubenswrapper[4675]: I0320 16:02:11.611236 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:12 crc kubenswrapper[4675]: I0320 16:02:12.612391 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:13 crc kubenswrapper[4675]: I0320 16:02:13.180164 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 16:02:13 crc kubenswrapper[4675]: I0320 16:02:13.180685 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:02:13 crc kubenswrapper[4675]: I0320 16:02:13.181888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4675]: I0320 16:02:13.181958 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4675]: I0320 16:02:13.181971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4675]: I0320 16:02:13.610952 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 20 16:02:14 crc kubenswrapper[4675]: I0320 16:02:14.610669 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.962678 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0401a933 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,LastTimestamp:2026-03-20 16:01:20.604801331 +0000 UTC m=+0.638430878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.971571 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.976993 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.982295 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.987154 4675 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0c8c796b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.748116331 +0000 UTC m=+0.781745868,LastTimestamp:2026-03-20 16:01:20.748116331 +0000 UTC m=+0.781745868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.989102 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.774106214 +0000 UTC m=+0.807735761,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.993704 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.774130465 +0000 UTC m=+0.807760012,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:14 crc kubenswrapper[4675]: E0320 16:02:14.997557 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a3d3f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.774140986 +0000 UTC m=+0.807770533,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.002738 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.775151757 +0000 UTC m=+0.808781324,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.007879 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.775180308 +0000 UTC m=+0.808809875,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.013300 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a3d3f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.775229379 +0000 UTC m=+0.808858956,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.019683 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.776512639 +0000 UTC m=+0.810142176,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.024881 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC 
m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.776537679 +0000 UTC m=+0.810167216,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.030259 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a3d3f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.77654886 +0000 UTC m=+0.810178397,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.035194 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.776794707 +0000 UTC m=+0.810424264,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.040521 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.776814368 +0000 UTC m=+0.810443915,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.045354 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a3d3f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.776829548 +0000 UTC m=+0.810459105,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.052672 4675 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.777512249 +0000 UTC m=+0.811141826,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.057033 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.77754283 +0000 UTC m=+0.811172407,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.061180 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a3d3f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.777562561 +0000 UTC m=+0.811192128,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.065134 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.777758317 +0000 UTC m=+0.811387874,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.068538 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.777795238 +0000 UTC m=+0.811424795,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.073205 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a3d3f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a3d3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655695167 +0000 UTC m=+0.689324704,LastTimestamp:2026-03-20 16:01:20.777808828 +0000 UTC m=+0.811438385,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.079118 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c0709b80a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c0709b80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655661066 +0000 UTC m=+0.689290603,LastTimestamp:2026-03-20 16:01:20.778134019 +0000 UTC m=+0.811763596,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.082648 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e980c070a1805\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980c070a1805 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:20.655685637 +0000 UTC m=+0.689315174,LastTimestamp:2026-03-20 16:01:20.778161159 +0000 UTC m=+0.811790726,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.088589 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980c268d009c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.184358556 +0000 UTC m=+1.217988123,LastTimestamp:2026-03-20 16:01:21.184358556 +0000 UTC m=+1.217988123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.092696 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980c27aa2fd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.203048406 +0000 UTC m=+1.236677953,LastTimestamp:2026-03-20 16:01:21.203048406 +0000 UTC m=+1.236677953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.094618 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c282a7ad0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.211456208 +0000 UTC m=+1.245085755,LastTimestamp:2026-03-20 16:01:21.211456208 +0000 UTC m=+1.245085755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.102157 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c288a0721 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.217718049 +0000 UTC m=+1.251347596,LastTimestamp:2026-03-20 16:01:21.217718049 +0000 UTC m=+1.251347596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.106711 4675 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c28a6323f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.219564095 +0000 UTC m=+1.253193632,LastTimestamp:2026-03-20 16:01:21.219564095 +0000 UTC m=+1.253193632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.111761 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c5167e115 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.903345941 +0000 UTC m=+1.936975478,LastTimestamp:2026-03-20 
16:01:21.903345941 +0000 UTC m=+1.936975478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.115861 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c51b4fa2d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.908398637 +0000 UTC m=+1.942028174,LastTimestamp:2026-03-20 16:01:21.908398637 +0000 UTC m=+1.942028174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.120571 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c51c2f8d7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.909315799 
+0000 UTC m=+1.942945346,LastTimestamp:2026-03-20 16:01:21.909315799 +0000 UTC m=+1.942945346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.125025 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980c51cdc490 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.910023312 +0000 UTC m=+1.943652869,LastTimestamp:2026-03-20 16:01:21.910023312 +0000 UTC m=+1.943652869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.129618 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980c51f869a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.912818082 +0000 UTC 
m=+1.946447619,LastTimestamp:2026-03-20 16:01:21.912818082 +0000 UTC m=+1.946447619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.134242 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c52185ecc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.91491246 +0000 UTC m=+1.948542007,LastTimestamp:2026-03-20 16:01:21.91491246 +0000 UTC m=+1.948542007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.139619 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c522fde6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.916452461 +0000 UTC m=+1.950082018,LastTimestamp:2026-03-20 16:01:21.916452461 +0000 UTC m=+1.950082018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.145432 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c526322e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.919812328 +0000 UTC m=+1.953441865,LastTimestamp:2026-03-20 16:01:21.919812328 +0000 UTC m=+1.953441865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.150376 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c52724116 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.920803094 +0000 UTC m=+1.954432621,LastTimestamp:2026-03-20 16:01:21.920803094 +0000 UTC m=+1.954432621,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.154520 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980c52ec6c4b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.928809547 +0000 UTC m=+1.962439084,LastTimestamp:2026-03-20 16:01:21.928809547 +0000 UTC m=+1.962439084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.158667 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980c52f47d3b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.929338171 +0000 UTC m=+1.962967708,LastTimestamp:2026-03-20 16:01:21.929338171 +0000 UTC m=+1.962967708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.163364 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c67170c19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.267147289 +0000 UTC m=+2.300776826,LastTimestamp:2026-03-20 16:01:22.267147289 +0000 UTC m=+2.300776826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.168098 4675 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c67c1328c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.278298252 +0000 UTC m=+2.311927789,LastTimestamp:2026-03-20 16:01:22.278298252 +0000 UTC m=+2.311927789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.173127 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c67d6b601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.279708161 +0000 UTC 
m=+2.313337698,LastTimestamp:2026-03-20 16:01:22.279708161 +0000 UTC m=+2.313337698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.178604 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c714069eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.437630443 +0000 UTC m=+2.471259980,LastTimestamp:2026-03-20 16:01:22.437630443 +0000 UTC m=+2.471259980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.182377 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c72637020 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.456703008 +0000 UTC m=+2.490332545,LastTimestamp:2026-03-20 16:01:22.456703008 +0000 UTC m=+2.490332545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.186341 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c72733487 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.457736327 +0000 UTC m=+2.491365864,LastTimestamp:2026-03-20 16:01:22.457736327 +0000 UTC m=+2.491365864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.189978 4675 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c7c8bfd80 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.6271328 +0000 UTC m=+2.660762337,LastTimestamp:2026-03-20 16:01:22.6271328 +0000 UTC m=+2.660762337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.194039 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c7db7fc4c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.646793292 +0000 UTC m=+2.680422839,LastTimestamp:2026-03-20 16:01:22.646793292 +0000 UTC 
m=+2.680422839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.197956 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980c80936f5f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.694729567 +0000 UTC m=+2.728359114,LastTimestamp:2026-03-20 16:01:22.694729567 +0000 UTC m=+2.728359114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.199686 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980c80b41ce1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.696871137 +0000 UTC m=+2.730500684,LastTimestamp:2026-03-20 16:01:22.696871137 +0000 UTC m=+2.730500684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.201867 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c81148772 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.703189874 +0000 UTC m=+2.736819431,LastTimestamp:2026-03-20 16:01:22.703189874 +0000 UTC m=+2.736819431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.203851 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c81270b1c 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.704403228 +0000 UTC m=+2.738032765,LastTimestamp:2026-03-20 16:01:22.704403228 +0000 UTC m=+2.738032765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.205405 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c8ec35c4c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.932751436 +0000 UTC m=+2.966380973,LastTimestamp:2026-03-20 16:01:22.932751436 +0000 UTC m=+2.966380973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.208023 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980c8ee6ebcb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.935081931 +0000 UTC m=+2.968711468,LastTimestamp:2026-03-20 16:01:22.935081931 +0000 UTC m=+2.968711468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.209251 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980c8ee8424e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.935169614 +0000 UTC m=+2.968799151,LastTimestamp:2026-03-20 16:01:22.935169614 +0000 UTC m=+2.968799151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.212228 4675 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c8eeaf287 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.935345799 +0000 UTC m=+2.968975336,LastTimestamp:2026-03-20 16:01:22.935345799 +0000 UTC m=+2.968975336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.216567 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-controller-manager-crc.189e980c8f497ef5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": dial tcp 192.168.126.11:10357: connect: connection refused Mar 20 16:02:15 crc kubenswrapper[4675]: body: Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.941542133 +0000 UTC 
m=+2.975171670,LastTimestamp:2026-03-20 16:01:22.941542133 +0000 UTC m=+2.975171670,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:02:15 crc kubenswrapper[4675]: > Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.220214 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c8f4a584e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.941597774 +0000 UTC m=+2.975227311,LastTimestamp:2026-03-20 16:01:22.941597774 +0000 UTC m=+2.975227311,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.223634 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c8f745691 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.944349841 +0000 UTC m=+2.977979378,LastTimestamp:2026-03-20 16:01:22.944349841 +0000 UTC m=+2.977979378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.227048 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c8f81c088 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.945228936 +0000 UTC m=+2.978858473,LastTimestamp:2026-03-20 16:01:22.945228936 +0000 UTC m=+2.978858473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.230795 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980c8fd5d547 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.950739271 +0000 UTC m=+2.984368808,LastTimestamp:2026-03-20 16:01:22.950739271 +0000 UTC m=+2.984368808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.236063 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c9073ca09 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.961091081 +0000 UTC m=+2.994720628,LastTimestamp:2026-03-20 16:01:22.961091081 +0000 UTC m=+2.994720628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.241073 4675 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c90932a9b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.963147419 +0000 UTC m=+2.996776956,LastTimestamp:2026-03-20 16:01:22.963147419 +0000 UTC m=+2.996776956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.245078 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980c90a09a64 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.964028004 +0000 UTC m=+2.997657551,LastTimestamp:2026-03-20 16:01:22.964028004 +0000 UTC m=+2.997657551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.249258 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c9b84b8e4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.14675018 +0000 UTC m=+3.180379717,LastTimestamp:2026-03-20 16:01:23.14675018 +0000 UTC m=+3.180379717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.253434 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c9cc42d94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.167686036 +0000 UTC m=+3.201315573,LastTimestamp:2026-03-20 
16:01:23.167686036 +0000 UTC m=+3.201315573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.257451 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c9cea8cdc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.170200796 +0000 UTC m=+3.203830333,LastTimestamp:2026-03-20 16:01:23.170200796 +0000 UTC m=+3.203830333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.261754 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980c9cfa8b63 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.171248995 +0000 UTC m=+3.204878532,LastTimestamp:2026-03-20 16:01:23.171248995 +0000 UTC m=+3.204878532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.265444 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c9d715593 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.179034003 +0000 UTC m=+3.212663540,LastTimestamp:2026-03-20 16:01:23.179034003 +0000 UTC m=+3.212663540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.268905 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980c9d7e410d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.179880717 +0000 UTC m=+3.213510254,LastTimestamp:2026-03-20 16:01:23.179880717 +0000 UTC m=+3.213510254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.272237 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980ca89003e1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.365594081 +0000 UTC m=+3.399223628,LastTimestamp:2026-03-20 16:01:23.365594081 +0000 UTC m=+3.399223628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.276029 4675 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980ca8c0fef1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.368804081 +0000 UTC m=+3.402433618,LastTimestamp:2026-03-20 16:01:23.368804081 +0000 UTC m=+3.402433618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.279730 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980ca9a424da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.383690458 +0000 UTC m=+3.417319985,LastTimestamp:2026-03-20 16:01:23.383690458 +0000 UTC m=+3.417319985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.286315 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980ca9b23011 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.384610833 +0000 UTC m=+3.418240370,LastTimestamp:2026-03-20 16:01:23.384610833 +0000 UTC m=+3.418240370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.291666 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980ca9ede971 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.388524913 +0000 UTC m=+3.422154450,LastTimestamp:2026-03-20 16:01:23.388524913 +0000 UTC m=+3.422154450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.295274 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cb2f65dc2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.540073922 +0000 UTC m=+3.573703459,LastTimestamp:2026-03-20 16:01:23.540073922 +0000 UTC m=+3.573703459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.299136 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cb452ff67 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.562921831 +0000 UTC m=+3.596551368,LastTimestamp:2026-03-20 16:01:23.562921831 +0000 UTC m=+3.596551368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.304554 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cb4631f5e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.56397859 +0000 UTC m=+3.597608127,LastTimestamp:2026-03-20 16:01:23.56397859 +0000 UTC m=+3.597608127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.310020 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cbcb2b35f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.703411551 +0000 UTC m=+3.737041088,LastTimestamp:2026-03-20 16:01:23.703411551 +0000 UTC m=+3.737041088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.316881 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cbd69473b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.715376955 +0000 UTC m=+3.749006492,LastTimestamp:2026-03-20 16:01:23.715376955 +0000 UTC m=+3.749006492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.320732 4675 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980cbdc2264c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.721201228 +0000 UTC m=+3.754830775,LastTimestamp:2026-03-20 16:01:23.721201228 +0000 UTC m=+3.754830775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.326025 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980cc7b44dca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.888065994 +0000 UTC m=+3.921695531,LastTimestamp:2026-03-20 16:01:23.888065994 +0000 UTC m=+3.921695531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.330212 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980cc8701b6d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.900373869 +0000 UTC m=+3.934003406,LastTimestamp:2026-03-20 16:01:23.900373869 +0000 UTC m=+3.934003406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.334305 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980cf9b990ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:24.727271598 +0000 UTC m=+4.760901175,LastTimestamp:2026-03-20 16:01:24.727271598 +0000 UTC m=+4.760901175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.338995 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d043678d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:24.90322965 +0000 UTC m=+4.936859197,LastTimestamp:2026-03-20 16:01:24.90322965 +0000 UTC m=+4.936859197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.342822 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d04d83c19 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:24.913830937 +0000 UTC m=+4.947460474,LastTimestamp:2026-03-20 16:01:24.913830937 +0000 UTC m=+4.947460474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.346065 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d04fd05fa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:24.916241914 +0000 UTC m=+4.949871461,LastTimestamp:2026-03-20 16:01:24.916241914 +0000 UTC m=+4.949871461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.349596 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d0fa67da9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.095120297 +0000 UTC m=+5.128749854,LastTimestamp:2026-03-20 16:01:25.095120297 +0000 UTC m=+5.128749854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.353393 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d10afd314 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.112509204 +0000 UTC m=+5.146138751,LastTimestamp:2026-03-20 16:01:25.112509204 +0000 UTC m=+5.146138751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.357220 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d10c4a080 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.113872512 +0000 UTC m=+5.147502059,LastTimestamp:2026-03-20 16:01:25.113872512 +0000 UTC 
m=+5.147502059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.360606 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d1d9e225e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.329453662 +0000 UTC m=+5.363083209,LastTimestamp:2026-03-20 16:01:25.329453662 +0000 UTC m=+5.363083209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.367033 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d1e92402f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.345452079 +0000 UTC m=+5.379081646,LastTimestamp:2026-03-20 16:01:25.345452079 +0000 UTC m=+5.379081646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.370202 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d1ea2caae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.34653611 +0000 UTC m=+5.380165657,LastTimestamp:2026-03-20 16:01:25.34653611 +0000 UTC m=+5.380165657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.373531 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d294eb15a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.525573978 +0000 UTC m=+5.559203515,LastTimestamp:2026-03-20 16:01:25.525573978 +0000 UTC 
m=+5.559203515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.377570 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d2a0cd7a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.538035616 +0000 UTC m=+5.571665153,LastTimestamp:2026-03-20 16:01:25.538035616 +0000 UTC m=+5.571665153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.381079 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d2a2046a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.539309222 +0000 UTC 
m=+5.572938759,LastTimestamp:2026-03-20 16:01:25.539309222 +0000 UTC m=+5.572938759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.385671 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d34c4537f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.717832575 +0000 UTC m=+5.751462112,LastTimestamp:2026-03-20 16:01:25.717832575 +0000 UTC m=+5.751462112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.389479 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980d3572d689 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:25.729269385 +0000 UTC m=+5.762898922,LastTimestamp:2026-03-20 16:01:25.729269385 +0000 UTC 
m=+5.762898922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.397198 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980cb4631f5e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cb4631f5e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.56397859 +0000 UTC m=+3.597608127,LastTimestamp:2026-03-20 16:01:34.7706147 +0000 UTC m=+14.804244237,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.400967 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-apiserver-crc.189e980f5a2187eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 16:02:15 crc kubenswrapper[4675]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 16:02:15 crc kubenswrapper[4675]: Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:34.934632427 +0000 UTC m=+14.968261964,LastTimestamp:2026-03-20 16:01:34.934632427 +0000 UTC m=+14.968261964,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:02:15 crc kubenswrapper[4675]: > Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.404945 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980f5a22170c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:34.934669068 +0000 UTC m=+14.968298605,LastTimestamp:2026-03-20 16:01:34.934669068 +0000 UTC m=+14.968298605,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.409116 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980f5a2187eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-apiserver-crc.189e980f5a2187eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 16:02:15 crc kubenswrapper[4675]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 16:02:15 crc kubenswrapper[4675]: Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:34.934632427 +0000 UTC m=+14.968261964,LastTimestamp:2026-03-20 16:01:34.940468561 +0000 UTC m=+14.974098108,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:02:15 crc kubenswrapper[4675]: > Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.415898 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980f5a22170c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980f5a22170c openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:34.934669068 +0000 UTC m=+14.968298605,LastTimestamp:2026-03-20 16:01:34.940526002 +0000 UTC m=+14.974155549,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.419824 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980cbcb2b35f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cbcb2b35f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.703411551 +0000 UTC m=+3.737041088,LastTimestamp:2026-03-20 16:01:35.028160233 +0000 UTC m=+15.061789770,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.425397 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980cbd69473b\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980cbd69473b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:23.715376955 +0000 UTC m=+3.749006492,LastTimestamp:2026-03-20 16:01:35.036814886 +0000 UTC m=+15.070444423,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.430348 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-controller-manager-crc.189e980f96363a1a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 16:02:15 crc kubenswrapper[4675]: body: Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942621722 +0000 UTC 
m=+15.976251319,LastTimestamp:2026-03-20 16:01:35.942621722 +0000 UTC m=+15.976251319,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:02:15 crc kubenswrapper[4675]: > Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.434672 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980f96377774 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942702964 +0000 UTC m=+15.976332531,LastTimestamp:2026-03-20 16:01:35.942702964 +0000 UTC m=+15.976332531,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.441584 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980f96363a1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-controller-manager-crc.189e980f96363a1a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 16:02:15 crc kubenswrapper[4675]: body: Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942621722 +0000 UTC m=+15.976251319,LastTimestamp:2026-03-20 16:01:45.942157357 +0000 UTC m=+25.975786904,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:02:15 crc kubenswrapper[4675]: > Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.447806 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980f96377774\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980f96377774 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942702964 +0000 UTC m=+15.976332531,LastTimestamp:2026-03-20 
16:01:45.942208199 +0000 UTC m=+25.975837736,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.453233 4675 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9811ea64ea0a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:45.944902154 +0000 UTC m=+25.978531701,LastTimestamp:2026-03-20 16:01:45.944902154 +0000 UTC m=+25.978531701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.460104 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980c522fde6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c522fde6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:21.916452461 +0000 UTC m=+1.950082018,LastTimestamp:2026-03-20 16:01:46.068944654 +0000 UTC m=+26.102574201,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.463979 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980c67170c19\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c67170c19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.267147289 +0000 UTC m=+2.300776826,LastTimestamp:2026-03-20 16:01:46.333890055 +0000 UTC m=+26.367519582,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.469412 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980c67c1328c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980c67c1328c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:22.278298252 +0000 UTC m=+2.311927789,LastTimestamp:2026-03-20 16:01:46.351341853 +0000 UTC m=+26.384971390,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.478608 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980f96363a1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-controller-manager-crc.189e980f96363a1a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 16:02:15 crc kubenswrapper[4675]: body:
Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942621722 +0000 UTC m=+15.976251319,LastTimestamp:2026-03-20 16:01:55.94288254 +0000 UTC m=+35.976512077,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 16:02:15 crc kubenswrapper[4675]: >
Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.483664 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980f96377774\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980f96377774 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942702964 +0000 UTC m=+15.976332531,LastTimestamp:2026-03-20 16:01:55.942939812 +0000 UTC m=+35.976569349,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:02:15 crc kubenswrapper[4675]: E0320 16:02:15.487887 4675 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980f96363a1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 16:02:15 crc kubenswrapper[4675]: &Event{ObjectMeta:{kube-controller-manager-crc.189e980f96363a1a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 16:02:15 crc kubenswrapper[4675]: body:
Mar 20 16:02:15 crc kubenswrapper[4675]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:35.942621722 +0000 UTC m=+15.976251319,LastTimestamp:2026-03-20 16:02:05.94255104 +0000 UTC m=+45.976180577,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 16:02:15 crc kubenswrapper[4675]: >
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.612991 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.908209 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.908469 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.909749 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.909819 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.909832 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:15 crc kubenswrapper[4675]: I0320 16:02:15.913522 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.358863 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.360379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.360422 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.360431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.360456 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:02:16 crc kubenswrapper[4675]: E0320 16:02:16.364482 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 16:02:16 crc kubenswrapper[4675]: E0320 16:02:16.364694 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.611157 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.913132 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.914264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.914312 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:16 crc kubenswrapper[4675]: I0320 16:02:16.914321 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:17 crc kubenswrapper[4675]: I0320 16:02:17.610895 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:18 crc kubenswrapper[4675]: I0320 16:02:18.610458 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:19 crc kubenswrapper[4675]: I0320 16:02:19.610503 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:20 crc kubenswrapper[4675]: I0320 16:02:20.612907 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:20 crc kubenswrapper[4675]: E0320 16:02:20.754403 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 16:02:21 crc kubenswrapper[4675]: I0320 16:02:21.609855 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:22 crc kubenswrapper[4675]: I0320 16:02:22.611884 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:23 crc kubenswrapper[4675]: I0320 16:02:23.365213 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:23 crc kubenswrapper[4675]: I0320 16:02:23.366478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:23 crc kubenswrapper[4675]: I0320 16:02:23.366717 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:23 crc kubenswrapper[4675]: I0320 16:02:23.366808 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:23 crc kubenswrapper[4675]: I0320 16:02:23.366911 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:02:23 crc kubenswrapper[4675]: E0320 16:02:23.370228 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 16:02:23 crc kubenswrapper[4675]: E0320 16:02:23.370495 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 16:02:23 crc kubenswrapper[4675]: I0320 16:02:23.610489 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.609371 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.673682 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.674872 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.674919 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.674932 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.675582 4675 scope.go:117] "RemoveContainer" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.933802 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.935620 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"}
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.935857 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.936830 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.936885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:24 crc kubenswrapper[4675]: I0320 16:02:24.936900 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.609370 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.940333 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.940891 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.942807 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd" exitCode=255
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.942853 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"}
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.942892 4675 scope.go:117] "RemoveContainer" containerID="432fac9653707e4988369289925c8afc7a1da6aede3d08b393984d87629ca706"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.943038 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.944061 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.944093 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.944141 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:25 crc kubenswrapper[4675]: I0320 16:02:25.944708 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"
Mar 20 16:02:25 crc kubenswrapper[4675]: E0320 16:02:25.944907 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.302127 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.610000 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.946628 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.948568 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.949383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.949414 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.949427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:26 crc kubenswrapper[4675]: I0320 16:02:26.949944 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"
Mar 20 16:02:26 crc kubenswrapper[4675]: E0320 16:02:26.950106 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.306651 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.610602 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.950896 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.951795 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.951819 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.951827 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:27 crc kubenswrapper[4675]: I0320 16:02:27.952279 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"
Mar 20 16:02:27 crc kubenswrapper[4675]: E0320 16:02:27.952415 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:02:28 crc kubenswrapper[4675]: I0320 16:02:28.609820 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:29 crc kubenswrapper[4675]: I0320 16:02:29.610105 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:30 crc kubenswrapper[4675]: I0320 16:02:30.370451 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:30 crc kubenswrapper[4675]: I0320 16:02:30.371900 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:30 crc kubenswrapper[4675]: I0320 16:02:30.371947 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:30 crc kubenswrapper[4675]: I0320 16:02:30.371957 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:30 crc kubenswrapper[4675]: I0320 16:02:30.371986 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:02:30 crc kubenswrapper[4675]: E0320 16:02:30.376905 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 16:02:30 crc kubenswrapper[4675]: E0320 16:02:30.377106 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 16:02:30 crc kubenswrapper[4675]: I0320 16:02:30.611063 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:30 crc kubenswrapper[4675]: E0320 16:02:30.754843 4675 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 16:02:31 crc kubenswrapper[4675]: I0320 16:02:31.611300 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:32 crc kubenswrapper[4675]: I0320 16:02:32.611423 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:32 crc kubenswrapper[4675]: I0320 16:02:32.673386 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:32 crc kubenswrapper[4675]: I0320 16:02:32.674483 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:32 crc kubenswrapper[4675]: I0320 16:02:32.674543 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:32 crc kubenswrapper[4675]: I0320 16:02:32.674558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:33 crc kubenswrapper[4675]: I0320 16:02:33.208092 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 16:02:33 crc kubenswrapper[4675]: I0320 16:02:33.226471 4675 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 16:02:33 crc kubenswrapper[4675]: I0320 16:02:33.611759 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:34 crc kubenswrapper[4675]: I0320 16:02:34.613817 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:35 crc kubenswrapper[4675]: I0320 16:02:35.612261 4675 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 16:02:35 crc kubenswrapper[4675]: I0320 16:02:35.662912 4675 csr.go:261] certificate signing request csr-w9q65 is approved, waiting to be issued
Mar 20 16:02:35 crc kubenswrapper[4675]: I0320 16:02:35.672537 4675 csr.go:257] certificate signing request csr-w9q65 is issued
Mar 20 16:02:35 crc kubenswrapper[4675]: I0320 16:02:35.757287 4675 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 20 16:02:36 crc kubenswrapper[4675]: I0320 16:02:36.456563 4675 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 20 16:02:36 crc kubenswrapper[4675]: I0320 16:02:36.673333 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 00:39:09.215352669 +0000 UTC
Mar 20 16:02:36 crc kubenswrapper[4675]: I0320 16:02:36.673394 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7112h36m32.541961771s for next certificate rotation
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.377806 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.378942 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.379013 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.379034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.379227 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.386978 4675 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.387356 4675 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.387403 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.391285 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.391334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.391348 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.391366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.391378 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.405714 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.413414 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.413456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.413468 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.413486 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.413498 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.426659 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.438534 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.438592 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.438614 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.438637 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.438654 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.450761 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.458897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.458937 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.458947 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.458963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4675]: I0320 16:02:37.458974 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.468162 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.468270 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.468288 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.568499 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.669447 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.770456 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.871406 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:37 crc kubenswrapper[4675]: E0320 16:02:37.972547 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.072801 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.172975 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.273599 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.373851 4675 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.474003 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.575138 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: I0320 16:02:38.598577 4675 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.675825 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.776807 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.877455 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:38 crc kubenswrapper[4675]: E0320 16:02:38.978080 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.078455 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.178960 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.279568 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.379941 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc 
kubenswrapper[4675]: E0320 16:02:39.480980 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.581086 4675 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.623852 4675 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.632422 4675 apiserver.go:52] "Watching apiserver" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.637401 4675 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.637832 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.638337 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.638392 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.638544 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.638866 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.639174 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.639035 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.639141 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.638979 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.640047 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.640504 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.641640 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.642201 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.642460 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.642650 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.642807 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.643161 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.643383 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.644409 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.681267 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.683338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.683369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.683378 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.683392 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.683401 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.701752 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.713530 4675 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.720183 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.735055 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.750338 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.762336 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.776420 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782349 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782403 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782458 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782494 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782525 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782571 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782681 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782733 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782826 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782878 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782926 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782939 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.782974 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783019 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783062 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783106 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783110 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783147 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783197 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783264 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783344 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783376 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783416 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783477 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783530 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783580 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.783632 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783676 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783718 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783761 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783840 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783885 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783933 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.783968 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784010 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784051 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784064 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784103 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784150 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784196 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784243 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784290 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784353 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784458 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784510 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784602 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784649 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784587 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784696 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784751 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784816 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784838 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784889 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784939 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784984 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785030 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785079 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785132 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785178 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785263 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785313 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785361 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785405 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785496 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785542 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786059 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786122 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786170 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786214 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786256 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786300 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786345 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786378 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.786410 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786441 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786484 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786550 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786582 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786626 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786671 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786717 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786798 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786851 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786905 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.786952 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786996 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787041 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787087 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787135 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787183 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787228 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787278 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787325 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787378 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787424 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.787469 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787512 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787561 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787603 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787640 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787690 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787753 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787864 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787915 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787967 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788058 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 
16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788110 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788154 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788214 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788258 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788302 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788349 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788393 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788442 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788485 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788528 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788575 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 
16:02:39.788622 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788668 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788718 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788802 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788852 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788895 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788942 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789006 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789066 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789118 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789171 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789214 4675 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789298 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789424 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789443 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789224 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794311 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794393 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794472 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794515 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794550 4675 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794583 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794618 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794650 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794685 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794729 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794929 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794968 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794992 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795019 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795087 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795119 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795143 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795174 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795206 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795196 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795307 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795346 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795409 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.796898 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.796952 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797106 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797145 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797321 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797370 4675 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797440 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797478 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797510 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797542 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797620 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797653 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797693 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797792 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797828 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797861 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797953 4675 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797987 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.798041 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.798076 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.798937 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.798989 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.799024 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799393 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799450 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799487 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800194 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800273 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800318 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800709 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800798 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800843 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800888 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800929 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800974 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801008 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801050 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801085 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801123 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" 
(UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801159 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801196 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801231 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801308 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.784886 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801346 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.785279 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801367 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.786788 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787625 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787749 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.787900 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788147 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788267 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788455 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788513 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788563 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788589 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.788863 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789193 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789676 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789710 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.789803 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.790004 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.790163 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.790356 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.790433 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.792417 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.792434 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.792444 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.792489 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.793028 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.793065 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.793297 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.793331 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.793615 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.793849 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794057 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794060 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794100 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794107 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794318 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801912 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795084 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801950 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802412 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802497 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802505 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795706 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795732 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795855 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.796058 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.796225 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.796714 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797063 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797186 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797216 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802651 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797371 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797877 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797954 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.797952 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.798202 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.798874 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799027 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799095 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799196 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799355 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.799874 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800055 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800091 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800605 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.800812 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801247 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801414 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.794744 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802719 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802929 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.802994 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.795286 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803215 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803342 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803429 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.801386 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803645 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803744 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: 
"1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803818 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.803878 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.804015 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.804178 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.804290 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805418 4675 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805429 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805511 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805522 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805570 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805928 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.805895 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.806100 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.806124 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.806454 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807239 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807267 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807293 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807326 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807426 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807537 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807615 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807688 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807990 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.807743 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808065 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808087 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808140 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808179 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808320 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808389 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808461 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808506 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808870 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808914 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808971 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809015 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809207 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809320 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809381 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.808521 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809731 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809807 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.809930 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.810041 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.810328 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.810559 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.810682 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.810711 4675 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.810729 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.810758 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.810902 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:40.310861277 +0000 UTC m=+80.344490844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811142 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811288 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811325 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.811399 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811420 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811532 4675 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811741 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811890 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.811956 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:40.311941107 +0000 UTC m=+80.345570644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.811968 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812028 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812185 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812196 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812235 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812278 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812294 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812305 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812318 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812329 4675 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812343 4675 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812353 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812363 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812376 4675 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812386 4675 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812396 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812407 4675 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812421 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812432 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812431 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812445 4675 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812479 4675 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812539 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812561 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812577 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812594 4675 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812614 4675 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812630 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812682 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812697 4675 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812716 4675 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812825 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812841 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812855 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812872 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812885 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812899 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812912 4675 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812933 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812946 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.812960 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813031 4675 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813029 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813042 4675 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813077 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813311 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813342 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813362 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813385 4675 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813405 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813424 4675 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813466 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813483 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813527 4675 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813585 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813602 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813603 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813617 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813661 4675 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813679 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813691 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813702 4675 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813715 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813732 4675 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813747 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813761 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813876 4675 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813892 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813904 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813915 4675 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813930 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813944 4675 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813958 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813972 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813988 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814002 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814014 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814025 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814039 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814052 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814065 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814080 4675 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814097 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814137 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814178 4675 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814307 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.814321 4675 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814335 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814347 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814366 4675 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814378 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814389 4675 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814406 4675 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814419 4675 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814521 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814567 4675 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814584 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814596 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814681 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814697 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813712 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: 
"trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813814 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.813936 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814001 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814207 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814255 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814531 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.814708 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.815275 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.815315 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.815345 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.815496 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:40.315451025 +0000 UTC m=+80.349080562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.816230 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.816238 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.817982 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.816610 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.817152 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.817337 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.818594 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.818930 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.819095 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.819297 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.819754 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.819980 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820071 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820181 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820157 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820252 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820511 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820599 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.820865 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.821121 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.821329 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.824314 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.824694 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.826967 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.830952 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.830981 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.830996 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.831021 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.831083 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:40.331053668 +0000 UTC m=+80.364683225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.831578 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.832250 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.832316 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.833410 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.833427 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.833788 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.834362 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.834981 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.834986 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.837092 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.837221 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.837447 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.837544 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.837891 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.838369 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.838997 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.839043 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.839073 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:39 crc kubenswrapper[4675]: E0320 16:02:39.839196 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:40.339146333 +0000 UTC m=+80.372775910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.839265 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.840599 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.841589 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.841579 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.842131 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.842936 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.845185 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.846812 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.847033 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.847064 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.847327 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.847885 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.848325 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.848415 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.848466 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.848705 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.848830 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.849069 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.849073 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.849921 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.855577 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.855712 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.856381 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.861837 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.862426 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.876810 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.895292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.895331 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.895340 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.895357 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.895368 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915098 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915136 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915176 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915199 4675 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915215 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915228 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915240 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915252 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915264 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915275 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915287 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915299 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915310 4675 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.915351 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915369 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915381 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915393 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915404 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915439 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915435 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915452 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915529 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915546 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915560 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915571 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915581 4675 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915590 4675 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 
crc kubenswrapper[4675]: I0320 16:02:39.915599 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915609 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915617 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915626 4675 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915636 4675 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915648 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915659 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915667 4675 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915676 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915684 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915692 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915701 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915712 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915723 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915732 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915741 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915750 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915760 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915794 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915805 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915816 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915826 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:39 crc 
kubenswrapper[4675]: I0320 16:02:39.915837 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915851 4675 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915862 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915874 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915887 4675 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915901 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915913 4675 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915925 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915936 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915948 4675 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915960 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915971 4675 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915983 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.915997 4675 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916009 4675 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916021 4675 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916034 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916048 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916063 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916076 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916089 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916102 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916115 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916127 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916140 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916152 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916166 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916178 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916190 4675 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916202 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916213 4675 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916225 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916236 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916248 4675 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916259 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916270 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916282 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916294 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916309 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916320 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916333 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916364 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916376 4675 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916386 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916397 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916408 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916418 4675 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916431 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916445 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.916457 4675 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.963124 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.973662 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.987510 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.997983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.998025 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.998038 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.998057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:39 crc kubenswrapper[4675]: I0320 16:02:39.998070 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: W0320 16:02:40.014338 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0e521cab045035bdeba0bc3149e03b4e4eb92349c0aec72e87990b01657070f6 WatchSource:0}: Error finding container 0e521cab045035bdeba0bc3149e03b4e4eb92349c0aec72e87990b01657070f6: Status 404 returned error can't find the container with id 0e521cab045035bdeba0bc3149e03b4e4eb92349c0aec72e87990b01657070f6
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.100329 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.100385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.100396 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.100415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.100427 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.202400 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.202459 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.202470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.202487 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.202497 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.304782 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.304829 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.304859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.304874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.304884 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.320612 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.320717 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.320791 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:41.320729838 +0000 UTC m=+81.354359415 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.320846 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.320859 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.320934 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:41.320914913 +0000 UTC m=+81.354544450 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.320983 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.321048 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:41.321031946 +0000 UTC m=+81.354661523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.408652 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.408738 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.408816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.408859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.408889 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.421679 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.421909 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.421940 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.421990 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.422015 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.422401 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:41.422362303 +0000 UTC m=+81.455991880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.422240 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.422474 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.422493 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.422551 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:41.422537637 +0000 UTC m=+81.456167214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.512675 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.512742 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.512754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.512796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.512812 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.616331 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.616416 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.616476 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.616515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.616541 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.679621 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.680421 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.682733 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.684574 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.686875 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.687990 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.689407 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.690996 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.691628 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.692924 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.694593 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"
Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.694906 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.695424 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.696568 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.698977 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.699695 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.700661 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.701343 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.702047 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.702719 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 20
16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.703151 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.703732 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.704356 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.704590 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.704890 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.705458 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.707016 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.708163 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.708802 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.710027 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.711361 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.712362 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.713652 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.715637 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.716315 4675 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.716445 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.717901 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.718509 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720112 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720212 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720506 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.720713 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.722540 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.723242 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.724833 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.725658 
4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.726760 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.727275 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.728284 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.728974 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.730013 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.730502 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.731538 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.732173 
4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.733313 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.733848 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.734813 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.735291 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.735921 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.736475 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.737132 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.737694 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.738633 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.754246 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.769935 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.822610 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.822673 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.822689 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.822715 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.822729 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.925917 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.925969 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.925981 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.926001 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.926013 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.985423 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0e521cab045035bdeba0bc3149e03b4e4eb92349c0aec72e87990b01657070f6"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.989102 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.989188 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.989207 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d190054736a7ab3abe9b3d43c89ab588a92c97dedc3e58856f34c0cc6bc9349a"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.991955 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.992068 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0531e9da5ed984170bf508d1d7540ad17d60038d9daa1c72541e14e6faf0cd47"} Mar 20 16:02:40 crc kubenswrapper[4675]: I0320 16:02:40.992426 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd" Mar 20 16:02:40 crc kubenswrapper[4675]: E0320 16:02:40.992585 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.010315 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.029025 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.029070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.029083 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.029106 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.029122 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.033749 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.054445 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.068427 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.083303 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.098834 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.115103 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.131571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.131614 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.131625 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.131645 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.131657 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.139671 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.157696 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.175862 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.197637 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.215470 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.229813 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.233963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.234049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.234064 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.234087 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.234101 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.245646 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:41Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.330404 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.330560 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.330611 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.330742 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:43.330687618 +0000 UTC m=+83.364317185 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.330788 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.330785 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.330953 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:43.330921164 +0000 UTC m=+83.364550731 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.331081 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:43.331052768 +0000 UTC m=+83.364682305 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.336888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.336960 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.336985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.337017 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.337041 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.432036 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.432107 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432305 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432364 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432379 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432445 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432501 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432519 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432476 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:43.432447486 +0000 UTC m=+83.466077203 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.432675 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:43.432627031 +0000 UTC m=+83.466256578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.440043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.440090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.440101 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.440123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.440135 4675 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.544646 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.544708 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.544718 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.544745 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.544797 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.647250 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.647301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.647312 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.647333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.647347 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.672925 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.673001 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.672925 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.673132 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.673746 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:41 crc kubenswrapper[4675]: E0320 16:02:41.673658 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.750367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.750420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.750433 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.750453 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.750467 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.853568 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.853633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.853650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.853674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.853692 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.956384 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.956434 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.956450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.956467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4675]: I0320 16:02:41.956478 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.059385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.059435 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.059446 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.059464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.059478 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.162733 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.162822 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.162837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.162857 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.162870 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.265911 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.265985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.266001 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.266030 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.266047 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.368561 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.368624 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.368637 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.368657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.368671 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.472133 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.472245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.472270 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.472309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.472333 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.576038 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.576143 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.576170 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.576205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.576226 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.678753 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.678826 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.678837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.678853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.678891 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.781820 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.782091 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.782176 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.782259 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.782338 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.885821 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.885873 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.885885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.885904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.885914 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.988971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.989022 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.989033 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.989048 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.989059 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4675]: I0320 16:02:42.998901 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.013572 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\
":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.034238 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operato
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.046426 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.064541 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.080706 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.091193 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 
16:02:43.091283 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.091309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.091345 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.091367 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.101679 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.114390 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:43Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.194542 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.194601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.194617 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.194642 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.194660 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.298631 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.298716 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.298733 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.298790 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.298815 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.351454 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.351543 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.351577 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.351664 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:47.351630684 +0000 UTC m=+87.385260221 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.351689 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.351751 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:47.351731177 +0000 UTC m=+87.385360784 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.351778 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.351932 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:02:47.351895932 +0000 UTC m=+87.385525679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.402403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.402498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.402517 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.402547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.402609 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.452540 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.452613 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.452840 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.452880 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.452907 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.452927 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:43 crc 
kubenswrapper[4675]: E0320 16:02:43.452997 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.453024 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.452999 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:47.452970001 +0000 UTC m=+87.486599578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.453136 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:47.453103155 +0000 UTC m=+87.486732732 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.505746 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.505810 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.505820 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.505837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.505848 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.608645 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.608696 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.608705 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.608722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.608731 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.673191 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.673277 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.673191 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.673421 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.673557 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:43 crc kubenswrapper[4675]: E0320 16:02:43.673735 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.711710 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.711809 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.711837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.711869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.711891 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.815019 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.815138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.815164 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.815193 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.815216 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.918910 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.919013 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.919035 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.919066 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:43 crc kubenswrapper[4675]: I0320 16:02:43.919088 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.022759 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.022886 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.022911 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.022935 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.022956 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.125866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.125941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.125962 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.125987 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.126003 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.228424 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.228475 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.228493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.228515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.228531 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.330451 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.330485 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.330494 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.330508 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.330519 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.432444 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.432493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.432505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.432521 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.432532 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.535710 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.535754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.535779 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.535794 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.535803 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.638185 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.638256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.638279 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.638309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.638343 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.742128 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.742162 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.742171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.742188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.742200 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.844639 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.844702 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.844720 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.844744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.844763 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.947615 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.947707 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.947734 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.947803 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:44 crc kubenswrapper[4675]: I0320 16:02:44.947834 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.050756 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.050832 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.050850 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.050873 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.050890 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.153159 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.153211 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.153223 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.153243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.153257 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.255355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.255403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.255421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.255441 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.255457 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.357846 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.357884 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.357896 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.357912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.357928 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.460833 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.460912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.460933 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.460959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.460976 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.563280 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.563339 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.563356 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.563382 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.563404 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.666924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.666971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.666983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.666999 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.667009 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.672689 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:02:45 crc kubenswrapper[4675]: E0320 16:02:45.672830 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.672875 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.672888 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:45 crc kubenswrapper[4675]: E0320 16:02:45.673052 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 16:02:45 crc kubenswrapper[4675]: E0320 16:02:45.673185 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.769762 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.769867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.769885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.769909 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.769927 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.871966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.872013 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.872024 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.872040 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.872053 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.974302 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.974342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.974353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.974367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:45 crc kubenswrapper[4675]: I0320 16:02:45.974377 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.077697 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.077823 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.077857 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.077887 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.077909 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.180089 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.180125 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.180134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.180149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.180160 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.283431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.283488 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.283501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.283523 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.283537 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.386058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.386110 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.386127 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.386149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.386175 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.489534 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.489585 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.489596 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.489615 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.489626 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.592370 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.592429 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.592442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.592461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.592473 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.694929 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.694979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.694992 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.695034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.695047 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.797970 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.798004 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.798012 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.798026 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.798034 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.900720 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.900757 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.900787 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.900803 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:46 crc kubenswrapper[4675]: I0320 16:02:46.900815 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:46Z","lastTransitionTime":"2026-03-20T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.002940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.002971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.002978 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.002991 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.003000 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.106287 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.106347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.106358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.106375 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.106385 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.208371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.208425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.208442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.208464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.208484 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.312086 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.312175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.312191 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.312215 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.312232 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.387884 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.387960 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.388002 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.388068 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.388111 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:55.388098759 +0000 UTC m=+95.421728296 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.388418 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:55.388407347 +0000 UTC m=+95.422036884 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.388480 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.388505 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:55.38849891 +0000 UTC m=+95.422128447 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.414671 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.414702 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.414714 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.414733 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.414742 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.488624 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.488686 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.488816 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.488832 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.488842 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.488885 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:55.48887337 +0000 UTC m=+95.522502907 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.488953 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.489003 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.489024 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.489112 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:55.489087406 +0000 UTC m=+95.522717023 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.517202 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.517242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.517251 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.517264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.517273 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.619916 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.619967 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.619976 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.619996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.620009 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.673700 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.673753 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.673799 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.674314 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.674160 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.674419 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.719324 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.719573 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.719660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.719790 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.719886 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.732190 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:47Z is after 2025-08-24T17:21:41Z"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.736676 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.736754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.736840 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.736874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.736898 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.751314 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:47Z is after 2025-08-24T17:21:41Z"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.755188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.755221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.755234 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.755252 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.755264 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.766899 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:47Z is after 2025-08-24T17:21:41Z"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.770741 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.770787 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.770802 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.770818 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.770831 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.784290 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:47Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.788606 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.788655 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.788669 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.788688 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.788703 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.801990 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:47Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:47 crc kubenswrapper[4675]: E0320 16:02:47.802221 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.803929 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.803969 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.803984 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.804007 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.804021 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.906902 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.907213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.907291 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.907370 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:47 crc kubenswrapper[4675]: I0320 16:02:47.907445 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:47Z","lastTransitionTime":"2026-03-20T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.009685 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.009748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.009759 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.009792 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.009804 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.112353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.112390 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.112401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.112419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.112431 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.214421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.214490 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.214508 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.214532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.214550 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.317271 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.317345 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.317359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.317382 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.317397 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.419742 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.419782 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.419791 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.419804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.419813 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.521649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.521708 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.521721 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.521739 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.521749 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.624458 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.624495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.624503 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.624516 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.624526 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.726990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.727063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.727082 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.727117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.727135 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.829868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.829924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.829938 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.829958 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.829977 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.932895 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.932979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.933003 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.933097 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:48 crc kubenswrapper[4675]: I0320 16:02:48.933170 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:48Z","lastTransitionTime":"2026-03-20T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.036084 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.036134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.036147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.036161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.036171 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.139695 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.139866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.139896 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.139930 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.139968 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.243866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.243926 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.243941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.243964 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.243982 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.347187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.347248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.347265 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.347290 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.347307 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.449393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.449430 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.449439 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.449452 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.449481 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.552706 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.552793 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.552809 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.552836 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.552859 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.655477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.655524 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.655544 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.655572 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.655593 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.673300 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.673465 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:49 crc kubenswrapper[4675]: E0320 16:02:49.673592 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.673628 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:49 crc kubenswrapper[4675]: E0320 16:02:49.673810 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:49 crc kubenswrapper[4675]: E0320 16:02:49.673944 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.758331 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.758363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.758391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.758407 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.758417 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.861565 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.861608 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.861619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.861634 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.861646 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.964900 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.964943 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.964955 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.964971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:49 crc kubenswrapper[4675]: I0320 16:02:49.964983 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:49Z","lastTransitionTime":"2026-03-20T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.068105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.068165 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.068181 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.068205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.068218 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.171113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.171175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.171195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.171219 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.171236 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.274151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.274204 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.274220 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.274242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.274260 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.377108 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.377181 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.377203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.377233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.377254 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.480151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.480221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.480237 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.480261 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.480279 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.583383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.583452 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.583477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.583512 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.583542 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.686232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.686289 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.686319 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.686348 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.686368 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.693601 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.709726 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.725960 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.740527 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.754967 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.774388 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.788620 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.789278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.789304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.789312 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.789329 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.789341 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.892025 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.892350 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.892361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.892379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.892392 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.995247 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.995341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.995361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.995390 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:50 crc kubenswrapper[4675]: I0320 16:02:50.995410 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:50Z","lastTransitionTime":"2026-03-20T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.098633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.098682 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.098695 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.098715 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.098728 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.201574 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.201622 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.201639 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.201662 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.201679 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.304409 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.304457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.304475 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.304497 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.304515 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.407973 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.408049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.408067 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.408102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.408125 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.511339 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.511398 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.511410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.511428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.511439 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.613976 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.614055 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.614082 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.614118 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.614151 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.673372 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.673379 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.673376 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:02:51 crc kubenswrapper[4675]: E0320 16:02:51.673522 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 16:02:51 crc kubenswrapper[4675]: E0320 16:02:51.673670 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 16:02:51 crc kubenswrapper[4675]: E0320 16:02:51.673941 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.717178 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.717216 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.717228 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.717283 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.717294 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.820381 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.820449 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.820469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.820493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.820512 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.922648 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.922701 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.922718 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.922739 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:51 crc kubenswrapper[4675]: I0320 16:02:51.922755 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:51Z","lastTransitionTime":"2026-03-20T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.024553 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.024602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.024614 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.024630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.024641 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.129837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.129938 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.129963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.130003 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.130028 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.232177 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.232616 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.232816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.233001 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.233140 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.336423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.336717 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.336834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.336941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.337039 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.440258 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.440314 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.440323 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.440342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.440354 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.543092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.543194 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.543215 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.543233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.543248 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.644896 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.644948 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.644963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.644989 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.645008 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.748055 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.748107 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.748118 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.748160 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.748169 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.851517 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.851605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.851630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.851667 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.851685 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.955388 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.955462 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.955485 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.955532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:52 crc kubenswrapper[4675]: I0320 16:02:52.955556 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:52Z","lastTransitionTime":"2026-03-20T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.057260 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.057316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.057335 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.057397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.057426 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.160145 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.160187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.160197 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.160211 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.160221 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.262730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.262802 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.262812 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.262824 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.262833 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.365273 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.365330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.365342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.365361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.365373 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.467980 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.468014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.468024 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.468058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.468070 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.570217 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.570263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.570280 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.570301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.570314 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.673179 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.673191 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.673189 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:02:53 crc kubenswrapper[4675]: E0320 16:02:53.673383 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 16:02:53 crc kubenswrapper[4675]: E0320 16:02:53.673528 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 16:02:53 crc kubenswrapper[4675]: E0320 16:02:53.673934 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.674695 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.674862 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.674924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.674945 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.674974 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.674994 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: E0320 16:02:53.675151 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.777645 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.777700 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.777719 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.777743 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.777759 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.879868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.879922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.879933 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.879955 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.879967 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.981841 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.981895 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.981912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.981936 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:53 crc kubenswrapper[4675]: I0320 16:02:53.981955 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:53Z","lastTransitionTime":"2026-03-20T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.084020 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.084057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.084067 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.084082 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.084094 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.186536 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.186599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.186611 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.186644 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.186655 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.289451 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.289508 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.289520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.289537 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.289555 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.392240 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.392299 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.392309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.392323 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.392333 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.495153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.495203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.495213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.495228 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.495239 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.597690 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.597733 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.597744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.597785 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.597799 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.700151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.700178 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.700187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.700200 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.700209 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.802412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.802463 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.802475 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.802494 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.802507 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.904655 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.904712 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.904729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.904747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4675]: I0320 16:02:54.904759 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.007441 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.007483 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.007498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.007516 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.007528 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.110233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.110266 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.110274 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.110290 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.110349 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.213438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.213512 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.213537 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.213569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.213592 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.315978 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.316018 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.316028 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.316047 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.316058 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.418237 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.418296 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.418305 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.418336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.418347 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.460366 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.460462 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.460494 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.460534 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:03:11.460513543 +0000 UTC m=+111.494143090 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.460600 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.460600 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.460669 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:11.460660807 +0000 UTC m=+111.494290344 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.460685 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:03:11.460678567 +0000 UTC m=+111.494308104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.521236 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.521275 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.521285 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.521304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.521314 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.561355 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.561499 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562220 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562250 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562279 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562293 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562307 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562319 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562395 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:11.562370523 +0000 UTC m=+111.596000090 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.562437 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:11.562411365 +0000 UTC m=+111.596040942 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.623405 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.623454 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.623483 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.623499 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.623508 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.672730 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.672745 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.672742 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.673106 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.673149 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:55 crc kubenswrapper[4675]: E0320 16:02:55.672940 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.726445 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.726511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.726522 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.726564 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.726578 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.829401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.829451 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.829465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.829485 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.829497 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.932171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.932233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.932252 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.932276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:55 crc kubenswrapper[4675]: I0320 16:02:55.932296 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:55Z","lastTransitionTime":"2026-03-20T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.034593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.034625 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.034633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.034646 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.034655 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.137529 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.137587 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.137605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.137629 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.137646 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.240376 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.240420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.240431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.240450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.240461 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.343607 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.343695 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.343734 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.343799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.343823 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.446045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.446099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.446115 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.446134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.446147 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.549407 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.549471 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.549488 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.549554 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.549573 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.652460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.652536 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.652551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.652570 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.652584 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.659558 4675 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.702323 4675 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.755395 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.755445 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.755471 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.755495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.755510 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.858276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.858364 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.858395 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.858423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.858445 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.961533 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.961626 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.961650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.961679 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:56 crc kubenswrapper[4675]: I0320 16:02:56.961704 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:56Z","lastTransitionTime":"2026-03-20T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.063682 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.063726 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.063737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.063754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.063781 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.165660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.165726 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.165747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.165794 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.165811 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.268205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.268254 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.268265 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.268286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.268300 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.371234 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.371292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.371308 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.371334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.371351 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.473666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.473723 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.473735 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.473751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.473791 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.576964 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.577006 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.577018 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.577034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.577046 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.673230 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.673265 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.673265 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.673526 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.673580 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.673383 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.679867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.679907 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.679922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.679941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.679951 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.782170 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.782210 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.782221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.782238 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.782251 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.832088 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.832151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.832168 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.832199 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.832215 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.847350 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.850983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.851023 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.851035 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.851050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.851061 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.864614 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.867920 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.867957 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.867966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.867983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.867993 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.881845 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.885315 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.885343 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.885352 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.885367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.885376 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.897366 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.901275 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.901311 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.901321 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.901337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.901347 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.915718 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4675]: E0320 16:02:57.915897 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.918334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.918364 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.918374 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.918391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:57 crc kubenswrapper[4675]: I0320 16:02:57.918405 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:57Z","lastTransitionTime":"2026-03-20T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.021064 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.021440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.021628 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.021820 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.021959 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.124919 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.124972 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.124989 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.125010 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.125062 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.227748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.227799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.227810 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.227825 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.227834 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.338437 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.338493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.338515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.338543 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.338563 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.442569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.442617 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.442634 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.442657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.442673 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.545039 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.545075 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.545087 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.545103 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.545115 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.648142 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.648174 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.648188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.648203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.648215 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.751477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.751514 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.751530 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.751552 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.751592 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.853589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.853616 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.853624 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.853637 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.853645 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.955723 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.955760 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.955786 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.955802 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:58 crc kubenswrapper[4675]: I0320 16:02:58.955812 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:58Z","lastTransitionTime":"2026-03-20T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.057735 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.057781 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.057789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.057801 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.057809 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.160591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.160614 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.160623 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.160636 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.160645 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.264571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.264630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.264651 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.264679 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.264699 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.367379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.367432 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.367443 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.367460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.367471 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.469984 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.470022 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.470031 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.470044 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.470053 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.572147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.572187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.572197 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.572214 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.572222 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.672748 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:59 crc kubenswrapper[4675]: E0320 16:02:59.672986 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.673030 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.673111 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:59 crc kubenswrapper[4675]: E0320 16:02:59.673204 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:59 crc kubenswrapper[4675]: E0320 16:02:59.673354 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.674672 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.674710 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.674725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.674747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.674792 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.686692 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.778813 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.778876 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.778892 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.778915 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.778931 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.881563 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.881620 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.881650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.881665 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.881675 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.984110 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.984191 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.984223 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.984250 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:59 crc kubenswrapper[4675]: I0320 16:02:59.984270 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:59Z","lastTransitionTime":"2026-03-20T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.086908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.086946 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.086980 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.086999 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.087009 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.188979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.189045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.189062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.189090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.189108 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.291647 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.291714 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.291731 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.291756 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.291805 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.394540 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.394599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.394616 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.394638 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.394655 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.498263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.498312 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.498329 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.498349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.498374 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.601897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.601981 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.602005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.602038 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.602059 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.694354 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.704518 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.704663 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.704750 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.704863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.704960 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.713450 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.738561 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.757351 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.776954 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.800084 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.808112 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.808303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.808432 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.808553 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.808681 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.815727 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.826651 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.911925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.911975 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.911984 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.911998 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:00 crc kubenswrapper[4675]: I0320 16:03:00.912007 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:00Z","lastTransitionTime":"2026-03-20T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.014959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.015027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.015049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.015076 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.015139 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.117393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.117430 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.117441 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.117456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.117467 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.220822 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.220905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.220928 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.220951 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.220967 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.323505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.323538 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.323549 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.323562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.323572 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.425644 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.425722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.425733 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.425754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.425841 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.528838 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.528912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.528927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.528944 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.528955 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.630857 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.630889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.630897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.630908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.630916 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.672619 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:01 crc kubenswrapper[4675]: E0320 16:03:01.672723 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.672802 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.672922 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:01 crc kubenswrapper[4675]: E0320 16:03:01.672993 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:01 crc kubenswrapper[4675]: E0320 16:03:01.673204 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.733715 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.733746 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.733757 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.733812 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.733824 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.837119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.837206 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.837226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.837248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.837264 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.939839 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.939876 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.939892 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.939910 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:01 crc kubenswrapper[4675]: I0320 16:03:01.939923 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:01Z","lastTransitionTime":"2026-03-20T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.042959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.043029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.043052 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.043082 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.043104 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.145862 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.145994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.146013 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.146037 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.146053 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.248492 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.248526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.248536 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.248554 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.248563 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.351286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.351342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.351354 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.351373 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.351386 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.453925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.454006 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.454029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.454058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.454079 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.557246 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.557315 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.557333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.557358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.557376 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.660493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.660560 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.660602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.660626 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.660644 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.764243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.764298 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.764320 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.764349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.764374 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.866908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.866978 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.867000 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.867027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.867052 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.970786 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.970850 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.970867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.970889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:02 crc kubenswrapper[4675]: I0320 16:03:02.970908 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:02Z","lastTransitionTime":"2026-03-20T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.072842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.072904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.072922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.072947 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.072963 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.175985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.176057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.176085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.176113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.176135 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.278355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.278403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.278415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.278436 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.278453 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.381292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.381327 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.381336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.381352 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.381365 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.483743 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.483842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.483863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.483893 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.483912 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.586647 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.586717 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.586741 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.586808 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.586829 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.673623 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.673652 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.673646 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:03 crc kubenswrapper[4675]: E0320 16:03:03.674267 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:03 crc kubenswrapper[4675]: E0320 16:03:03.674128 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:03 crc kubenswrapper[4675]: E0320 16:03:03.674540 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.688864 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.688922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.688941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.688965 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.688981 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.715896 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5vk6l"] Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.716501 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.718713 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.718718 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.719014 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.735242 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.754175 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.769368 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.785792 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.792043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 
16:03:03.792085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.792095 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.792113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.792125 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.798892 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.808813 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.828626 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.840449 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.840877 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99feab4d-4648-4d25-acf1-c779dae4c9da-hosts-file\") pod \"node-resolver-5vk6l\" (UID: \"99feab4d-4648-4d25-acf1-c779dae4c9da\") " pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.840939 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr7w7\" (UniqueName: \"kubernetes.io/projected/99feab4d-4648-4d25-acf1-c779dae4c9da-kube-api-access-sr7w7\") pod \"node-resolver-5vk6l\" (UID: \"99feab4d-4648-4d25-acf1-c779dae4c9da\") " pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.860443 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:03Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.893805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.894026 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.894138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.894226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.894317 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.941464 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99feab4d-4648-4d25-acf1-c779dae4c9da-hosts-file\") pod \"node-resolver-5vk6l\" (UID: \"99feab4d-4648-4d25-acf1-c779dae4c9da\") " pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.941753 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr7w7\" (UniqueName: \"kubernetes.io/projected/99feab4d-4648-4d25-acf1-c779dae4c9da-kube-api-access-sr7w7\") pod \"node-resolver-5vk6l\" (UID: \"99feab4d-4648-4d25-acf1-c779dae4c9da\") " pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.941600 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99feab4d-4648-4d25-acf1-c779dae4c9da-hosts-file\") pod \"node-resolver-5vk6l\" (UID: \"99feab4d-4648-4d25-acf1-c779dae4c9da\") " pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.959969 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr7w7\" (UniqueName: \"kubernetes.io/projected/99feab4d-4648-4d25-acf1-c779dae4c9da-kube-api-access-sr7w7\") pod \"node-resolver-5vk6l\" (UID: \"99feab4d-4648-4d25-acf1-c779dae4c9da\") " pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.996264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.996320 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.996332 4675 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.996349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:03 crc kubenswrapper[4675]: I0320 16:03:03.996359 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:03Z","lastTransitionTime":"2026-03-20T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.031082 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5vk6l" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.052640 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5vk6l" event={"ID":"99feab4d-4648-4d25-acf1-c779dae4c9da","Type":"ContainerStarted","Data":"a06a1347d9cb0c2034e4743be613bdb93a23f31d7f45e419cfa9c50206296c5c"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.086830 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tpfs5"] Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.087165 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xdnn9"] Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.087666 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tvqmz"] Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.088031 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.088269 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.088031 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.090485 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.091084 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.091173 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.091635 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.093442 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.093815 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.093979 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.094128 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 
16:03:04.094441 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.094619 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.094794 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.095081 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.099079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.099202 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.099302 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.099392 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.099479 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.104987 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.120071 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.135192 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144504 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-cnibin\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144547 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-os-release\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144572 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1277d318-d05a-4621-af3f-d9237e553399-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144594 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/31f7145a-b091-4511-a3e6-0c7d380dea57-rootfs\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144616 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-cnibin\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144636 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-kubelet\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144654 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31f7145a-b091-4511-a3e6-0c7d380dea57-proxy-tls\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144677 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-system-cni-dir\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144698 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-cni-multus\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144756 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144832 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-k8s-cni-cncf-io\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144900 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-os-release\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144942 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4g7\" (UniqueName: \"kubernetes.io/projected/1277d318-d05a-4621-af3f-d9237e553399-kube-api-access-js4g7\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.144966 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-netns\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145157 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-multus-certs\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145197 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-hostroot\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145269 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-conf-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145306 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31f7145a-b091-4511-a3e6-0c7d380dea57-mcd-auth-proxy-config\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145347 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d530666-72d8-4520-a229-43eab240e5dd-cni-binary-copy\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145377 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1277d318-d05a-4621-af3f-d9237e553399-cni-binary-copy\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145406 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj5cl\" (UniqueName: \"kubernetes.io/projected/31f7145a-b091-4511-a3e6-0c7d380dea57-kube-api-access-gj5cl\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145438 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-socket-dir-parent\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145479 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-cni-bin\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145625 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stlwl\" (UniqueName: \"kubernetes.io/projected/7d530666-72d8-4520-a229-43eab240e5dd-kube-api-access-stlwl\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " 
pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145659 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-system-cni-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145688 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-cni-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145714 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-etc-kubernetes\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.145740 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d530666-72d8-4520-a229-43eab240e5dd-multus-daemon-config\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.148137 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.163258 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.177056 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.194306 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.201832 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.201860 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.201867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.201880 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.201888 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.207305 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.220149 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.231233 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.242109 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246312 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-system-cni-dir\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246355 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-cni-multus\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246399 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246430 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-k8s-cni-cncf-io\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246441 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-system-cni-dir\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246459 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4g7\" (UniqueName: \"kubernetes.io/projected/1277d318-d05a-4621-af3f-d9237e553399-kube-api-access-js4g7\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246492 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-netns\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246521 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-multus-certs\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246560 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-os-release\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246591 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-hostroot\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246618 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-conf-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246648 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31f7145a-b091-4511-a3e6-0c7d380dea57-mcd-auth-proxy-config\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246688 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d530666-72d8-4520-a229-43eab240e5dd-cni-binary-copy\") 
pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246718 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5cl\" (UniqueName: \"kubernetes.io/projected/31f7145a-b091-4511-a3e6-0c7d380dea57-kube-api-access-gj5cl\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246748 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1277d318-d05a-4621-af3f-d9237e553399-cni-binary-copy\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246817 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-cni-bin\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246892 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stlwl\" (UniqueName: \"kubernetes.io/projected/7d530666-72d8-4520-a229-43eab240e5dd-kube-api-access-stlwl\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246922 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-socket-dir-parent\") pod \"multus-tvqmz\" 
(UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247240 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-cni-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247273 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-etc-kubernetes\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247303 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-hostroot\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246949 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-os-release\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247181 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-socket-dir-parent\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.246719 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-cni-multus\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247305 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-system-cni-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247393 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-conf-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247411 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d530666-72d8-4520-a229-43eab240e5dd-multus-daemon-config\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247459 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-cnibin\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247490 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-os-release\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247523 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-cni-bin\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247525 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1277d318-d05a-4621-af3f-d9237e553399-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247600 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/31f7145a-b091-4511-a3e6-0c7d380dea57-rootfs\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247622 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-kubelet\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247669 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31f7145a-b091-4511-a3e6-0c7d380dea57-proxy-tls\") pod 
\"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247693 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-cnibin\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247805 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-cnibin\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247842 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/31f7145a-b091-4511-a3e6-0c7d380dea57-rootfs\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247904 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-var-lib-kubelet\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247077 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-multus-certs\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc 
kubenswrapper[4675]: I0320 16:03:04.247370 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-system-cni-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247007 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-netns\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.248327 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-etc-kubernetes\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.247025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-host-run-k8s-cni-cncf-io\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.248334 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-multus-cni-dir\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.248527 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-cnibin\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.248530 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d530666-72d8-4520-a229-43eab240e5dd-os-release\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.248752 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1277d318-d05a-4621-af3f-d9237e553399-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.248933 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1277d318-d05a-4621-af3f-d9237e553399-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.249014 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1277d318-d05a-4621-af3f-d9237e553399-cni-binary-copy\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.249089 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7d530666-72d8-4520-a229-43eab240e5dd-cni-binary-copy\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.249216 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d530666-72d8-4520-a229-43eab240e5dd-multus-daemon-config\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.249559 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31f7145a-b091-4511-a3e6-0c7d380dea57-mcd-auth-proxy-config\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.260750 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.261113 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31f7145a-b091-4511-a3e6-0c7d380dea57-proxy-tls\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.263784 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5cl\" (UniqueName: \"kubernetes.io/projected/31f7145a-b091-4511-a3e6-0c7d380dea57-kube-api-access-gj5cl\") pod \"machine-config-daemon-tpfs5\" (UID: \"31f7145a-b091-4511-a3e6-0c7d380dea57\") " pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.264246 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4g7\" (UniqueName: \"kubernetes.io/projected/1277d318-d05a-4621-af3f-d9237e553399-kube-api-access-js4g7\") pod \"multus-additional-cni-plugins-xdnn9\" (UID: \"1277d318-d05a-4621-af3f-d9237e553399\") " pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.266004 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stlwl\" (UniqueName: \"kubernetes.io/projected/7d530666-72d8-4520-a229-43eab240e5dd-kube-api-access-stlwl\") pod \"multus-tvqmz\" (UID: \"7d530666-72d8-4520-a229-43eab240e5dd\") " pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.273511 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.286125 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.298559 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.304232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.304269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.304278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 
16:03:04.304293 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.304304 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.310981 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.327096 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.337957 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.348011 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.357000 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.367255 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.377667 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.406238 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.406273 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.406282 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.406295 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.406305 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.412652 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.424011 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.430751 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tvqmz" Mar 20 16:03:04 crc kubenswrapper[4675]: W0320 16:03:04.442194 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f7145a_b091_4511_a3e6_0c7d380dea57.slice/crio-1c23af78eccb5e22e509afa7cc03104e2281d83011f6f6a067dde90f304f59bb WatchSource:0}: Error finding container 1c23af78eccb5e22e509afa7cc03104e2281d83011f6f6a067dde90f304f59bb: Status 404 returned error can't find the container with id 1c23af78eccb5e22e509afa7cc03104e2281d83011f6f6a067dde90f304f59bb Mar 20 16:03:04 crc kubenswrapper[4675]: W0320 16:03:04.450664 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d530666_72d8_4520_a229_43eab240e5dd.slice/crio-88d618a5fcf00dee492f34848d548fd312748cce21d11bf5b7f2bde447606653 WatchSource:0}: Error finding container 88d618a5fcf00dee492f34848d548fd312748cce21d11bf5b7f2bde447606653: Status 404 returned error can't find the container with id 88d618a5fcf00dee492f34848d548fd312748cce21d11bf5b7f2bde447606653 Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.457624 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n54g5"] Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.459480 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463102 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463144 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463294 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463343 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463503 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463532 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.463678 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.474088 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.496377 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.509329 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.509353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.509362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.509374 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.509382 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.511787 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.522519 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.533967 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.547443 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550237 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-kubelet\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550266 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-config\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550338 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-netns\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550359 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-ovn\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550377 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-env-overrides\") pod \"ovnkube-node-n54g5\" (UID: 
\"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550400 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-var-lib-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550463 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-script-lib\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550543 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550569 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-slash\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550588 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-node-log\") pod 
\"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550618 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-systemd\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550641 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-netd\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550680 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-ovn-kubernetes\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550703 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-systemd-units\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550725 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-bin\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550798 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-etc-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550825 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjgt\" (UniqueName: \"kubernetes.io/projected/467da034-edb5-4a24-a940-839cc0131c75-kube-api-access-dqjgt\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550850 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-log-socket\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550873 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.550917 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467da034-edb5-4a24-a940-839cc0131c75-ovn-node-metrics-cert\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.563749 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.582754 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.596538 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.609272 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.611886 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.611923 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.611935 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.611953 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.611969 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.621404 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.630937 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.640446 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:04Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651750 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-netns\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651809 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-ovn\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651827 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-env-overrides\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651843 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-var-lib-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651861 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-script-lib\") pod 
\"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651902 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651919 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-slash\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651934 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-node-log\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651951 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-systemd\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651967 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-netd\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.651988 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-ovn-kubernetes\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652003 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-systemd-units\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652016 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-bin\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652031 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-etc-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652048 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjgt\" (UniqueName: \"kubernetes.io/projected/467da034-edb5-4a24-a940-839cc0131c75-kube-api-access-dqjgt\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" 
Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652061 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-log-socket\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652076 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652099 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467da034-edb5-4a24-a940-839cc0131c75-ovn-node-metrics-cert\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652118 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-kubelet\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652138 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-config\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc 
kubenswrapper[4675]: I0320 16:03:04.652502 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-netd\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652691 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652808 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-slash\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652885 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-ovn-kubernetes\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652896 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-config\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652957 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652993 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-systemd-units\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653022 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-bin\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653050 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-etc-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.652924 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-log-socket\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653081 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-script-lib\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653108 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-node-log\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653088 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-kubelet\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653142 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-systemd\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653157 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-netns\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653187 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-var-lib-openvswitch\") pod \"ovnkube-node-n54g5\" (UID: 
\"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653209 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-ovn\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.653579 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-env-overrides\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.656580 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467da034-edb5-4a24-a940-839cc0131c75-ovn-node-metrics-cert\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.669158 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjgt\" (UniqueName: \"kubernetes.io/projected/467da034-edb5-4a24-a940-839cc0131c75-kube-api-access-dqjgt\") pod \"ovnkube-node-n54g5\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.713561 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.713602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 
16:03:04.713612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.713627 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.713637 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.774347 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:04 crc kubenswrapper[4675]: W0320 16:03:04.789529 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467da034_edb5_4a24_a940_839cc0131c75.slice/crio-3c07fae4f4f4fff6a2f6afc73bb9f06aee2c0f4c46bbd2e3b42afdc96ff9191f WatchSource:0}: Error finding container 3c07fae4f4f4fff6a2f6afc73bb9f06aee2c0f4c46bbd2e3b42afdc96ff9191f: Status 404 returned error can't find the container with id 3c07fae4f4f4fff6a2f6afc73bb9f06aee2c0f4c46bbd2e3b42afdc96ff9191f Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.816271 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.816322 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.816338 4675 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.816358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.816374 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.918628 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.918664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.918674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.918691 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:04 crc kubenswrapper[4675]: I0320 16:03:04.918702 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:04Z","lastTransitionTime":"2026-03-20T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.021633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.021696 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.021704 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.021717 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.021727 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.059567 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5vk6l" event={"ID":"99feab4d-4648-4d25-acf1-c779dae4c9da","Type":"ContainerStarted","Data":"da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.062947 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvqmz" event={"ID":"7d530666-72d8-4520-a229-43eab240e5dd","Type":"ContainerStarted","Data":"457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.063013 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvqmz" event={"ID":"7d530666-72d8-4520-a229-43eab240e5dd","Type":"ContainerStarted","Data":"88d618a5fcf00dee492f34848d548fd312748cce21d11bf5b7f2bde447606653"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.069395 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" exitCode=0 Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.069510 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.069541 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"3c07fae4f4f4fff6a2f6afc73bb9f06aee2c0f4c46bbd2e3b42afdc96ff9191f"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.073904 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.073973 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.073991 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"1c23af78eccb5e22e509afa7cc03104e2281d83011f6f6a067dde90f304f59bb"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.078598 4675 generic.go:334] "Generic (PLEG): container finished" podID="1277d318-d05a-4621-af3f-d9237e553399" containerID="d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170" exitCode=0 Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.078639 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerDied","Data":"d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.078660 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerStarted","Data":"b5cfa2cd20e48089462936d44deecf85512614ed11ffc53f9ce8a7f7b3d0234c"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.080210 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.106178 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.118270 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.124944 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.125021 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.125031 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.125045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.125247 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.131188 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.144487 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.156054 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.167390 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.186417 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.201490 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.214992 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.225866 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.228665 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.228725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.228737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.228753 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.228786 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.280924 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.294725 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.327927 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.330690 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.330727 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.330736 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.330750 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.330760 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.343228 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.359653 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.372406 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.383205 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.394729 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.405044 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.420357 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.431872 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.433490 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.433521 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.433533 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.433550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.433562 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.443823 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.455113 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.469501 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.490171 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.536637 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.536675 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.536685 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.536700 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.536709 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.639470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.639788 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.639804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.639821 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.639833 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.672653 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.672732 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.672748 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:05 crc kubenswrapper[4675]: E0320 16:03:05.672883 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:05 crc kubenswrapper[4675]: E0320 16:03:05.672932 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:05 crc kubenswrapper[4675]: E0320 16:03:05.673361 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.673458 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.743240 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.743278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.743289 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.743304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.743317 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.845656 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.845693 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.845702 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.845716 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.845731 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.948324 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.948385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.948401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.948421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4675]: I0320 16:03:05.948438 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.050201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.050258 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.050275 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.050298 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.050315 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.085319 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.085415 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.085445 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.085470 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.087365 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerStarted","Data":"72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.089472 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.097192 4675 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.109293 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.124554 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.157234 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.157286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.157299 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.157319 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.157331 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.161432 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.192386 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.209503 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.232202 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.249628 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.259308 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.259350 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.259358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.259372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.259381 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.263445 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.277128 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.286674 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.295807 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.308839 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.327791 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.362502 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.362538 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.362547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.362561 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.362570 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.464885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.464927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.464937 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.464952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.464962 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.567632 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.567669 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.567681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.567696 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.567707 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.670434 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.670480 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.670490 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.670505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.670517 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.773620 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.773666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.773677 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.773694 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.773705 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.876744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.876848 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.876869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.876892 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.876910 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.979711 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.979758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.979797 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.979816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:06 crc kubenswrapper[4675]: I0320 16:03:06.979827 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:06Z","lastTransitionTime":"2026-03-20T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.083422 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.083487 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.083505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.083529 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.083546 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.104357 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.104446 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.106352 4675 generic.go:334] "Generic (PLEG): container finished" podID="1277d318-d05a-4621-af3f-d9237e553399" containerID="72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d" exitCode=0 Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.106445 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerDied","Data":"72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.106975 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.124362 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.138860 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.151666 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.167157 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.180242 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.187111 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.187162 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.187177 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.187196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.187208 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.214426 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.230321 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.241404 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.256014 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.268331 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.279602 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.289804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.289845 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.289856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.289871 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.289882 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.292075 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z 
is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.309641 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.320802 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.338740 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.351899 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.365664 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.378212 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.391408 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.392173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.392207 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.392216 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.392229 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.392239 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.406616 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.425111 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.438677 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.452117 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.466247 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.483286 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.494338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.494382 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.494394 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.494412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.494424 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.495237 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:07Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.597220 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.597276 4675 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.597292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.597316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.597332 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.673664 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.673684 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.673872 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:07 crc kubenswrapper[4675]: E0320 16:03:07.674032 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:07 crc kubenswrapper[4675]: E0320 16:03:07.674180 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:07 crc kubenswrapper[4675]: E0320 16:03:07.674303 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.699545 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.699586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.699594 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.699609 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.699618 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.802077 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.802125 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.802137 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.802156 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.802173 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.904971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.905034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.905050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.905073 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:07 crc kubenswrapper[4675]: I0320 16:03:07.905090 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:07Z","lastTransitionTime":"2026-03-20T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.007671 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.007730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.007752 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.007802 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.007820 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.109928 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.109985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.110006 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.110030 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.110048 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.115153 4675 generic.go:334] "Generic (PLEG): container finished" podID="1277d318-d05a-4621-af3f-d9237e553399" containerID="2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29" exitCode=0 Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.115247 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerDied","Data":"2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.127102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.127146 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.127157 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.127173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.127185 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.145848 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: E0320 16:03:08.147984 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.154316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.154382 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.154405 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.154435 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.154460 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: E0320 16:03:08.179270 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.183350 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.183378 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.183386 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.183399 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.183408 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.183620 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: E0320 16:03:08.198158 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.198551 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.202026 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.202063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.202079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.202099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.202114 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.211506 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: E0320 16:03:08.215471 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.221500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.221580 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.221604 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.221635 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.221658 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.223157 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: E0320 16:03:08.235577 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: E0320 16:03:08.235743 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.237402 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3
706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.238702 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.238759 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.238805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc 
kubenswrapper[4675]: I0320 16:03:08.238830 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.238849 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.258639 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.270735 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc
05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.282237 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.293820 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.308063 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 
16:03:08.320312 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.334656 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:08Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.341087 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.341159 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.341172 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.341188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.341200 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.442934 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.442979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.442988 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.443006 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.443018 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.545544 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.545584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.545592 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.545605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.545616 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.648136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.648191 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.648209 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.648233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.648249 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.750381 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.750409 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.750417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.750429 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.750437 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.853424 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.853461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.853471 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.853484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.853497 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.956211 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.956249 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.956259 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.956276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:08 crc kubenswrapper[4675]: I0320 16:03:08.956286 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:08Z","lastTransitionTime":"2026-03-20T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.059426 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.059500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.059527 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.059555 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.059577 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.124103 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.127411 4675 generic.go:334] "Generic (PLEG): container finished" podID="1277d318-d05a-4621-af3f-d9237e553399" containerID="4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a" exitCode=0 Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.127466 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerDied","Data":"4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.152191 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.163153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.163242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.163253 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.163316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.163329 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.173405 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.193094 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.208642 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.227214 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.258406 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.267369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.267428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.267439 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.267459 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.267473 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.279411 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.298230 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.314170 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.366000 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.370680 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.370709 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.370721 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.370739 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.370754 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.379964 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.394619 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.416509 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:09Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.473992 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.474059 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.474074 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.474102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.474118 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.576651 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.577156 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.577289 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.577420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.577566 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.673872 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.673960 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:09 crc kubenswrapper[4675]: E0320 16:03:09.674065 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.673881 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:09 crc kubenswrapper[4675]: E0320 16:03:09.674153 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:09 crc kubenswrapper[4675]: E0320 16:03:09.674213 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.680685 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.681001 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.681085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.681154 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.681238 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.784074 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.784129 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.784142 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.784162 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.784177 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.887311 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.887800 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.887872 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.887939 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.887997 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.991610 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.991657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.991670 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.991688 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:09 crc kubenswrapper[4675]: I0320 16:03:09.991703 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:09Z","lastTransitionTime":"2026-03-20T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.094712 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.095266 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.095453 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.095595 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.095818 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.135648 4675 generic.go:334] "Generic (PLEG): container finished" podID="1277d318-d05a-4621-af3f-d9237e553399" containerID="cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d" exitCode=0 Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.135702 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerDied","Data":"cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.158902 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.191274 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.201429 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.202020 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.202038 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.202062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.202074 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.209630 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.227533 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.247427 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.264689 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.280988 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.295892 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.305941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.305979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.305990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.306009 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.306023 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.309539 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.328529 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.340911 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.350653 4675 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.362453 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.400389 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rgxhl"] Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.400889 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.403559 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.404020 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.404244 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.405223 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.409371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.409404 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.409413 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.409427 4675 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.409437 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.418047 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.433001 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.447539 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.459643 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.470590 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.484161 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.511260 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.512789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.512823 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.512834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.512852 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.512863 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.530628 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.532943 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-serviceca\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.533010 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-host\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.533061 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xpf\" (UniqueName: \"kubernetes.io/projected/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-kube-api-access-c9xpf\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.543693 4675 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.552137 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.564696 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.584740 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.594945 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.608597 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.615029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc 
kubenswrapper[4675]: I0320 16:03:10.615060 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.615070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.615084 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.615096 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.634398 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-serviceca\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.634486 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-host\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.634545 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xpf\" (UniqueName: \"kubernetes.io/projected/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-kube-api-access-c9xpf\") pod 
\"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.634639 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-host\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.635640 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-serviceca\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.651057 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xpf\" (UniqueName: \"kubernetes.io/projected/5e7c4491-d0d1-486c-aa7e-7a439eae4f22-kube-api-access-c9xpf\") pod \"node-ca-rgxhl\" (UID: \"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\") " pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.687705 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.700012 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.721730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.721804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.721828 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.721846 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.721859 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.739939 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.752480 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rgxhl" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.766420 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPa
th\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: W0320 16:03:10.777487 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e7c4491_d0d1_486c_aa7e_7a439eae4f22.slice/crio-76767976494a3ba0e73907276ae7f1a2fff6f3c1cec8094ca58e0e618e428544 WatchSource:0}: Error finding container 76767976494a3ba0e73907276ae7f1a2fff6f3c1cec8094ca58e0e618e428544: Status 404 returned error can't find the container with id 76767976494a3ba0e73907276ae7f1a2fff6f3c1cec8094ca58e0e618e428544 Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.791347 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.804649 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.821927 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.823562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.823592 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.823603 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.823620 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.823634 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.834793 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.845065 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.856915 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.881438 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.903070 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.916788 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.926368 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.926703 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.926723 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.926742 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.926756 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:10Z","lastTransitionTime":"2026-03-20T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:10 crc kubenswrapper[4675]: I0320 16:03:10.934266 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:10Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.029997 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.030052 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.030066 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.030085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.030098 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.132662 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.132701 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.132713 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.132732 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.132743 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.145989 4675 generic.go:334] "Generic (PLEG): container finished" podID="1277d318-d05a-4621-af3f-d9237e553399" containerID="ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216" exitCode=0 Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.146077 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerDied","Data":"ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.156098 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.156153 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.156218 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.156245 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.160255 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rgxhl" event={"ID":"5e7c4491-d0d1-486c-aa7e-7a439eae4f22","Type":"ContainerStarted","Data":"76767976494a3ba0e73907276ae7f1a2fff6f3c1cec8094ca58e0e618e428544"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.161715 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.190809 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.203015 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.217747 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.225483 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.228885 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.229462 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.234729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.234799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.234814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.234832 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.234847 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.245447 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.260665 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.274732 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.291347 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.304429 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.322110 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.338241 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.338301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.338318 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.338662 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.338710 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.341912 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.355015 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.366813 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.380545 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.392761 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.406955 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.419706 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.436933 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.441695 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.441739 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.441753 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.441793 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.441806 4675 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.463127 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d
4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.479757 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.497329 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.512839 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.525971 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.542213 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.542656 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.542953 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.543139 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:03:43.54309548 +0000 UTC m=+143.576725017 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.543320 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.543376 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:43.543364257 +0000 UTC m=+143.576993794 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.543732 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.544056 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:43.544020626 +0000 UTC m=+143.577650363 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.544932 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3
706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.549903 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.549961 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.549974 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc 
kubenswrapper[4675]: I0320 16:03:11.549996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.550011 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.562686 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.585893 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.600243 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:11Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.643878 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.643970 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.645747 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.645852 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.645872 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.645956 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:43.645934148 +0000 UTC m=+143.679563705 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.649979 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.650013 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.650036 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.650164 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:43.650084463 +0000 UTC m=+143.683714010 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.653990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.654048 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.654071 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.654101 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.654120 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.672989 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.673035 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.673125 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.673481 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.673575 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:11 crc kubenswrapper[4675]: E0320 16:03:11.673653 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.757161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.757214 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.757225 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.757243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.757269 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.859926 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.859974 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.859991 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.860015 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.860031 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.963062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.963109 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.963121 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.963137 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:11 crc kubenswrapper[4675]: I0320 16:03:11.963146 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:11Z","lastTransitionTime":"2026-03-20T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.066836 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.066908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.066931 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.066971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.066994 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.165624 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rgxhl" event={"ID":"5e7c4491-d0d1-486c-aa7e-7a439eae4f22","Type":"ContainerStarted","Data":"78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.169550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.169604 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.169619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.169638 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.169648 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.171531 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" event={"ID":"1277d318-d05a-4621-af3f-d9237e553399","Type":"ContainerStarted","Data":"07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.184761 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.204036 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.227753 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.242401 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.256446 4675 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.270103 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.271601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.271626 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.271639 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.271657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.271668 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.283078 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z 
is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.301402 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.313660 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.328011 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.343675 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.359997 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.374967 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.375021 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.375037 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.375058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.375081 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.376226 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.395713 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.413710 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.426089 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.436850 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.452954 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.467760 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.477427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.477466 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.477478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.477495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.477509 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.487162 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.497920 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.517304 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.527711 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.539727 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.554833 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.566633 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.578649 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.579877 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.579913 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.579924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.579941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.579954 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.604386 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:12Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.682273 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.682311 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.682322 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.682336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.682347 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.784990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.785042 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.785057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.785094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.785109 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.887634 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.887672 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.887683 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.887699 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.887710 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.991020 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.991059 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.991072 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.991090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:12 crc kubenswrapper[4675]: I0320 16:03:12.991102 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:12Z","lastTransitionTime":"2026-03-20T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.094371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.094468 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.094492 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.094520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.094543 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.197105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.197139 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.197147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.197159 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.197168 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.299303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.299330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.299337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.299349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.299359 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.402930 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.402962 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.402971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.402985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.402993 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.505196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.505245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.505254 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.505269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.505280 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.608798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.608840 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.608851 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.608869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.608878 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.673239 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.673286 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:13 crc kubenswrapper[4675]: E0320 16:03:13.673360 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.673293 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:13 crc kubenswrapper[4675]: E0320 16:03:13.673426 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:13 crc kubenswrapper[4675]: E0320 16:03:13.673537 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.710682 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.710716 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.710725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.710737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.710746 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.813286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.813334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.813345 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.813362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.813373 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.916146 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.916213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.916227 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.916253 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:13 crc kubenswrapper[4675]: I0320 16:03:13.916273 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:13Z","lastTransitionTime":"2026-03-20T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.019908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.019963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.019976 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.019996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.020008 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.123472 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.123523 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.123532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.123551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.123561 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.178864 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/0.log" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.182111 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c" exitCode=1 Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.182185 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.183474 4675 scope.go:117] "RemoveContainer" containerID="2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.201320 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.226131 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"message\\\":\\\"ctor *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983303 6503 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 16:03:13.983381 6503 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 16:03:13.983434 6503 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983982 6503 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 16:03:13.984011 6503 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 16:03:13.984034 6503 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 16:03:13.984039 6503 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:03:13.984060 6503 factory.go:656] Stopping watch factory\\\\nI0320 16:03:13.984076 6503 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:13.984107 6503 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:13.984103 6503 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 16:03:13.984129 6503 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb4
1570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.227191 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.227313 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.227341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.227376 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.227399 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.239128 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.252908 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.272192 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.283134 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.299096 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.323116 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.330417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.330459 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.330470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.330487 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.330497 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.344757 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.360820 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.384745 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.400437 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.414002 4675 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.427290 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.432831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.432900 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.432913 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.432927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.432964 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.535940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.535975 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.535983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.535998 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.536007 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.638370 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.638419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.638432 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.638448 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.638459 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.740498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.740546 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.740569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.740586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.740599 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.842890 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.842920 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.842930 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.842945 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.842954 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.945751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.945793 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.945802 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.945814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:14 crc kubenswrapper[4675]: I0320 16:03:14.945822 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:14Z","lastTransitionTime":"2026-03-20T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.048149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.048220 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.048242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.048269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.048287 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.151438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.151476 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.151484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.151502 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.151512 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.189313 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/0.log" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.192701 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.193631 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.210133 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.221938 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.234039 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55
e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}
}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.250320 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"
2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.253979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.254057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.254077 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.254101 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.254115 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.271876 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"message\\\":\\\"ctor *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983303 6503 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 16:03:13.983381 6503 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 16:03:13.983434 6503 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983982 6503 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 16:03:13.984011 6503 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 16:03:13.984034 6503 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 16:03:13.984039 6503 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:03:13.984060 6503 factory.go:656] Stopping watch factory\\\\nI0320 16:03:13.984076 6503 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:13.984107 6503 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:13.984103 6503 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 16:03:13.984129 6503 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.280637 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.296134 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.308926 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.319041 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.329060 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.343471 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.356311 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.356360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.356369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.356383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.356393 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.362887 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.377119 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.388482 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.458923 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.458950 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.458959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc 
kubenswrapper[4675]: I0320 16:03:15.458973 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.458981 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.563162 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.563256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.563281 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.563316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.563339 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.666596 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.666640 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.666649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.666664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.666674 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.672965 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.673048 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:15 crc kubenswrapper[4675]: E0320 16:03:15.673079 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:15 crc kubenswrapper[4675]: E0320 16:03:15.673145 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.673042 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:15 crc kubenswrapper[4675]: E0320 16:03:15.673221 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.770814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.770856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.770865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.770882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.770892 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.874707 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.874800 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.874814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.874839 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.874853 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.978917 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.978975 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.978994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.979021 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4675]: I0320 16:03:15.979039 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.082450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.082515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.082531 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.082555 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.082574 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.185492 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.185559 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.185578 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.185601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.185617 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.197239 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/1.log" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.198217 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/0.log" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.200796 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7" exitCode=1 Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.200840 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.200900 4675 scope.go:117] "RemoveContainer" containerID="2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.201329 4675 scope.go:117] "RemoveContainer" containerID="d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7" Mar 20 16:03:16 crc kubenswrapper[4675]: E0320 16:03:16.201472 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.217857 4675 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.234722 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.248043 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.261111 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.278421 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.288994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.289044 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.289059 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.289084 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.289099 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.302024 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.305975 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.319536 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.332072 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.343330 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.356102 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.369632 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.384932 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.392290 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.392380 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.392407 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.392440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.392465 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.409965 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"message\\\":\\\"ctor *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983303 6503 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 16:03:13.983381 6503 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 16:03:13.983434 6503 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983982 6503 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 16:03:13.984011 6503 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 16:03:13.984034 6503 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 16:03:13.984039 6503 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:03:13.984060 6503 factory.go:656] Stopping watch factory\\\\nI0320 16:03:13.984076 6503 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:13.984107 6503 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:13.984103 6503 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 16:03:13.984129 6503 handler.go:208] Removed *v1.Pod event handler 3\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2cc
b41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.422756 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.443691 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.458668 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.460990 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n"] Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.461481 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.463642 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.464255 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.489935 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.494706 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.494738 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.494751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.494790 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc 
kubenswrapper[4675]: I0320 16:03:16.494808 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.499660 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57beb770-7d25-4973-bfe4-27e249cd1a54-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.499740 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57beb770-7d25-4973-bfe4-27e249cd1a54-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.499824 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57beb770-7d25-4973-bfe4-27e249cd1a54-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.499863 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5rcl8\" (UniqueName: \"kubernetes.io/projected/57beb770-7d25-4973-bfe4-27e249cd1a54-kube-api-access-5rcl8\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.507996 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.518998 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.529925 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.582063 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.600967 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57beb770-7d25-4973-bfe4-27e249cd1a54-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.601068 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57beb770-7d25-4973-bfe4-27e249cd1a54-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.601160 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57beb770-7d25-4973-bfe4-27e249cd1a54-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.601209 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rcl8\" (UniqueName: \"kubernetes.io/projected/57beb770-7d25-4973-bfe4-27e249cd1a54-kube-api-access-5rcl8\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603007 4675 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603072 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603089 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603122 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603139 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603434 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57beb770-7d25-4973-bfe4-27e249cd1a54-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.603985 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57beb770-7d25-4973-bfe4-27e249cd1a54-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.616266 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"message\\\":\\\"ctor *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 
16:03:13.983303 6503 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 16:03:13.983381 6503 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 16:03:13.983434 6503 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983982 6503 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 16:03:13.984011 6503 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 16:03:13.984034 6503 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 16:03:13.984039 6503 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:03:13.984060 6503 factory.go:656] Stopping watch factory\\\\nI0320 16:03:13.984076 6503 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:13.984107 6503 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:13.984103 6503 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 16:03:13.984129 6503 handler.go:208] Removed *v1.Pod event handler 3\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2cc
b41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.618517 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57beb770-7d25-4973-bfe4-27e249cd1a54-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.625092 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rcl8\" (UniqueName: \"kubernetes.io/projected/57beb770-7d25-4973-bfe4-27e249cd1a54-kube-api-access-5rcl8\") pod \"ovnkube-control-plane-749d76644c-8pp4n\" (UID: \"57beb770-7d25-4973-bfe4-27e249cd1a54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.630564 4675 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.646138 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.662054 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.682144 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.701439 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.705624 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.705692 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.705708 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.705734 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.705805 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.721491 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.740178 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.754051 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.769239 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.784342 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.784487 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" Mar 20 16:03:16 crc kubenswrapper[4675]: W0320 16:03:16.804486 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57beb770_7d25_4973_bfe4_27e249cd1a54.slice/crio-60d8ffb458b0028c218dc85d67ecf647d18f78341eb4c04deacd39fa72e5060f WatchSource:0}: Error finding container 60d8ffb458b0028c218dc85d67ecf647d18f78341eb4c04deacd39fa72e5060f: Status 404 returned error can't find the container with id 60d8ffb458b0028c218dc85d67ecf647d18f78341eb4c04deacd39fa72e5060f Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.807789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.807823 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.807835 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.807853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.807893 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.822729 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2788aa9339ca6911399e1bc1f4133048ec55c78765959b785cb17d7a8cb7e87c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"message\\\":\\\"ctor *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983303 6503 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 16:03:13.983381 6503 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 16:03:13.983434 6503 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:03:13.983982 6503 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 16:03:13.984011 6503 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 16:03:13.984034 6503 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 16:03:13.984039 6503 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:03:13.984060 6503 factory.go:656] Stopping watch factory\\\\nI0320 16:03:13.984076 6503 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:13.984107 6503 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:13.984103 6503 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 16:03:13.984129 6503 handler.go:208] Removed *v1.Pod event handler 3\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2cc
b41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.838431 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.858251 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.878874 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.899507 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.912316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.912367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.912380 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.912622 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.912656 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:16Z","lastTransitionTime":"2026-03-20T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.914786 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.937470 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.953585 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.976738 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2
090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4675]: I0320 16:03:16.993843 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec
624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.010633 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.014872 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.014916 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.014929 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc 
kubenswrapper[4675]: I0320 16:03:17.014947 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.014960 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.118146 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.118182 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.118193 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.118210 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.118222 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.206797 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" event={"ID":"57beb770-7d25-4973-bfe4-27e249cd1a54","Type":"ContainerStarted","Data":"4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.206850 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" event={"ID":"57beb770-7d25-4973-bfe4-27e249cd1a54","Type":"ContainerStarted","Data":"60d8ffb458b0028c218dc85d67ecf647d18f78341eb4c04deacd39fa72e5060f"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.208844 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/1.log" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.213941 4675 scope.go:117] "RemoveContainer" containerID="d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.214151 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.220935 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.220980 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.220990 4675 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.221008 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.221022 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.227812 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mrjmp"] Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.229492 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.229632 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.239216 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.252486 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.272285 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.302673 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.308118 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqsr\" (UniqueName: \"kubernetes.io/projected/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-kube-api-access-wmqsr\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.308269 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.331047 4675 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.331085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.331094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.331112 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.331122 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.337130 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.365946 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.387183 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.403325 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.409395 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqsr\" (UniqueName: \"kubernetes.io/projected/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-kube-api-access-wmqsr\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.409466 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.409612 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.409685 4675 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:17.909664058 +0000 UTC m=+117.943293595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.421168 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.428729 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqsr\" (UniqueName: \"kubernetes.io/projected/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-kube-api-access-wmqsr\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.433877 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.433983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.433994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.434013 4675 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.434028 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.438818 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.462448 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.483661 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.505722 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.519978 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.530429 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.536451 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.536504 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.536522 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.536554 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.536572 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.540189 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.554680 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.564271 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.574353 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc 
kubenswrapper[4675]: I0320 16:03:17.588901 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.600681 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.613710 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.637887 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.639330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.639361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.639369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.639383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.639394 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.653866 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.667266 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.672674 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.672674 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.672736 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.672788 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.672844 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.672924 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.681266 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.699527 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb
5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.718380 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.731158 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.742198 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.742274 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.742302 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.742334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.742357 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.746296 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z 
is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.770426 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:17Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.845612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.845700 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.845719 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.845744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.845785 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.915039 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.915245 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:17 crc kubenswrapper[4675]: E0320 16:03:17.915374 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:18.915339532 +0000 UTC m=+118.948969109 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.949282 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.949340 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.949351 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.949376 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:17 crc kubenswrapper[4675]: I0320 16:03:17.949389 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:17Z","lastTransitionTime":"2026-03-20T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.052532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.052600 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.052612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.052633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.052650 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.157722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.157837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.157859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.157888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.157909 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.231083 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" event={"ID":"57beb770-7d25-4973-bfe4-27e249cd1a54","Type":"ContainerStarted","Data":"f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.252858 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.261501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.261600 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.261622 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.261647 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.261664 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.290368 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.309610 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.330005 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.354601 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.364796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.364870 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.364885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 
16:03:18.364915 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.364933 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.371953 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.389183 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.412481 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.427003 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.443255 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc 
kubenswrapper[4675]: I0320 16:03:18.461059 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.461095 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.461105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.461117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.461126 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.474633 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.478968 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.483006 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.483029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.483038 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.483051 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.483061 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.495871 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4
6cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.502870 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.508366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.508438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.508462 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.508492 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.508517 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.512386 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.525469 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.530107 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.533104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.533137 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.533150 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.533165 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.533176 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.538274 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.550403 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.550728 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3
706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.553333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.553355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.553365 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc 
kubenswrapper[4675]: I0320 16:03:18.553379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.553390 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.571064 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:18Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.571304 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.573314 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.573348 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.573361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.573393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.573410 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.675154 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.675378 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.677827 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.677861 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.677878 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.677901 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.677916 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.782518 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.782569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.782582 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.782603 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.782618 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.886092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.886155 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.886174 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.886200 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.886217 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.933733 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.934060 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:18 crc kubenswrapper[4675]: E0320 16:03:18.934207 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:20.934175907 +0000 UTC m=+120.967805474 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.989062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.989152 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.989177 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.989201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:18 crc kubenswrapper[4675]: I0320 16:03:18.989219 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:18Z","lastTransitionTime":"2026-03-20T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.186065 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.186111 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.186122 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.186141 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.186151 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.288650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.288698 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.288716 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.288739 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.288759 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.392127 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.392203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.392224 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.392252 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.392279 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.495824 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.496117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.496133 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.496151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.496190 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.598500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.598545 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.598560 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.598579 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.598592 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.673071 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:19 crc kubenswrapper[4675]: E0320 16:03:19.673204 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.673576 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.673067 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:19 crc kubenswrapper[4675]: E0320 16:03:19.673928 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:19 crc kubenswrapper[4675]: E0320 16:03:19.673922 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.682381 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.701372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.701409 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.701420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.701435 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.701447 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.804396 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.804471 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.804492 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.804526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.804548 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.907495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.907569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.907587 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.907612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:19 crc kubenswrapper[4675]: I0320 16:03:19.907631 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:19Z","lastTransitionTime":"2026-03-20T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.011401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.011470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.011492 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.011525 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.011546 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:20Z","lastTransitionTime":"2026-03-20T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.114571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.114642 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.114666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.114698 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.114721 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:20Z","lastTransitionTime":"2026-03-20T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.218035 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.218117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.218140 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.218169 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.218189 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:20Z","lastTransitionTime":"2026-03-20T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.321447 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.321521 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.321541 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.321567 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.321587 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:20Z","lastTransitionTime":"2026-03-20T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.424363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.424421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.424443 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.424468 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.424486 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:20Z","lastTransitionTime":"2026-03-20T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.527438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.527506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.527522 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.527548 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.527566 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:20Z","lastTransitionTime":"2026-03-20T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:20 crc kubenswrapper[4675]: E0320 16:03:20.628240 4675 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.673111 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:20 crc kubenswrapper[4675]: E0320 16:03:20.673978 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.694932 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.710243 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.746378 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.766483 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: E0320 16:03:20.774262 4675 
kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.778858 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.794455 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3
706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.814727 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.833830 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.855094 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.867292 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.878840 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.892580 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.906261 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.921279 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.934624 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc kubenswrapper[4675]: I0320 16:03:20.945230 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:20 crc 
kubenswrapper[4675]: I0320 16:03:20.960129 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:20Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:21 crc kubenswrapper[4675]: I0320 16:03:21.004328 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:21 crc kubenswrapper[4675]: E0320 16:03:21.004516 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:21 crc kubenswrapper[4675]: E0320 16:03:21.004593 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:25.00457521 +0000 UTC m=+125.038204757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:21 crc kubenswrapper[4675]: I0320 16:03:21.673069 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:21 crc kubenswrapper[4675]: I0320 16:03:21.673188 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:21 crc kubenswrapper[4675]: E0320 16:03:21.673334 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:21 crc kubenswrapper[4675]: I0320 16:03:21.673122 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:21 crc kubenswrapper[4675]: E0320 16:03:21.673522 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:21 crc kubenswrapper[4675]: E0320 16:03:21.673908 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:22 crc kubenswrapper[4675]: I0320 16:03:22.673869 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:22 crc kubenswrapper[4675]: E0320 16:03:22.674031 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:23 crc kubenswrapper[4675]: I0320 16:03:23.672823 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:23 crc kubenswrapper[4675]: I0320 16:03:23.672905 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:23 crc kubenswrapper[4675]: E0320 16:03:23.672983 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:23 crc kubenswrapper[4675]: E0320 16:03:23.673124 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:23 crc kubenswrapper[4675]: I0320 16:03:23.673320 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:23 crc kubenswrapper[4675]: E0320 16:03:23.673508 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:24 crc kubenswrapper[4675]: I0320 16:03:24.674019 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:24 crc kubenswrapper[4675]: E0320 16:03:24.674255 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:25 crc kubenswrapper[4675]: I0320 16:03:25.043032 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:25 crc kubenswrapper[4675]: E0320 16:03:25.043163 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:25 crc kubenswrapper[4675]: E0320 16:03:25.043531 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:33.043512703 +0000 UTC m=+133.077142250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:25 crc kubenswrapper[4675]: I0320 16:03:25.673342 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:25 crc kubenswrapper[4675]: I0320 16:03:25.673384 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:25 crc kubenswrapper[4675]: I0320 16:03:25.673401 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:25 crc kubenswrapper[4675]: E0320 16:03:25.673575 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:25 crc kubenswrapper[4675]: E0320 16:03:25.673670 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:25 crc kubenswrapper[4675]: E0320 16:03:25.673918 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:25 crc kubenswrapper[4675]: E0320 16:03:25.776150 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:26 crc kubenswrapper[4675]: I0320 16:03:26.673372 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:26 crc kubenswrapper[4675]: E0320 16:03:26.673580 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:27 crc kubenswrapper[4675]: I0320 16:03:27.672814 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:27 crc kubenswrapper[4675]: I0320 16:03:27.672758 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:27 crc kubenswrapper[4675]: I0320 16:03:27.672880 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:27 crc kubenswrapper[4675]: E0320 16:03:27.673619 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:27 crc kubenswrapper[4675]: E0320 16:03:27.673792 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:27 crc kubenswrapper[4675]: E0320 16:03:27.673943 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:27 crc kubenswrapper[4675]: I0320 16:03:27.674154 4675 scope.go:117] "RemoveContainer" containerID="d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.268416 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/1.log" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.272631 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12"} Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.273199 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.293744 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.312218 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.323953 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.342391 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610d
ee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85
a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.367511 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe5
68d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.378765 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc 
kubenswrapper[4675]: I0320 16:03:28.399865 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.414199 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.431744 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.443056 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.463650 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.478146 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec
624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.490313 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.504255 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3
706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.519264 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.537947 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.551422 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.673090 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.673260 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.700075 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.700134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.700150 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.700175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.700195 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:28Z","lastTransitionTime":"2026-03-20T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.719154 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.723780 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.723828 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.723842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.723858 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.723871 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:28Z","lastTransitionTime":"2026-03-20T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.744284 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.748816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.748868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.748885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.748906 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.748921 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:28Z","lastTransitionTime":"2026-03-20T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.764316 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.768589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.768628 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.768640 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.768654 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.768664 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:28Z","lastTransitionTime":"2026-03-20T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.781825 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.785255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.785291 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.785304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.785323 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:28 crc kubenswrapper[4675]: I0320 16:03:28.785336 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:28Z","lastTransitionTime":"2026-03-20T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.802844 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:28Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:28 crc kubenswrapper[4675]: E0320 16:03:28.803005 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.280032 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/2.log" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.280937 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/1.log" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.284999 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12" exitCode=1 Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.285054 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12"} Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.285111 4675 scope.go:117] "RemoveContainer" containerID="d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.287161 4675 scope.go:117] "RemoveContainer" containerID="e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12" Mar 20 16:03:29 crc kubenswrapper[4675]: E0320 16:03:29.287652 4675 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.315440 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3
cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\"
:\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.334340 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.361386 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.377401 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.391536 4675 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.405982 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.426975 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.450151 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.485366 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d051cf2d7cb97456b73091544886860e2c2642c6f4a0b678796f666844298dd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"olumn _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108658 6673 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:15.108705 6673 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:15.108751 6673 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:15.108820 6673 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:15.108912 6673 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 
16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb4157
0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.501744 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.523550 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.541935 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.563849 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.585379 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.604351 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.622615 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc 
kubenswrapper[4675]: I0320 16:03:29.649614 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:29Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.673211 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.673257 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:29 crc kubenswrapper[4675]: I0320 16:03:29.673337 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:29 crc kubenswrapper[4675]: E0320 16:03:29.673398 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:29 crc kubenswrapper[4675]: E0320 16:03:29.673572 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:29 crc kubenswrapper[4675]: E0320 16:03:29.673645 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.292643 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/2.log" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.299938 4675 scope.go:117] "RemoveContainer" containerID="e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12" Mar 20 16:03:30 crc kubenswrapper[4675]: E0320 16:03:30.300278 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.318093 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.340250 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.374686 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 
16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.396395 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.420815 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.439497 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.453945 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc 
kubenswrapper[4675]: I0320 16:03:30.471700 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.492448 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.512213 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.538017 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.557033 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.573679 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.591678 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.607484 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.623576 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.639962 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.672739 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:30 crc kubenswrapper[4675]: E0320 16:03:30.672968 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.694420 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.714592 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.734703 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.752852 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.776323 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: E0320 16:03:30.777263 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.792063 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe5
68d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.806299 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc 
kubenswrapper[4675]: I0320 16:03:30.837533 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.860209 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.879675 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.894852 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.911509 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.927228 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.944904 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:30 crc kubenswrapper[4675]: I0320 16:03:30.964899 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:31 crc kubenswrapper[4675]: I0320 16:03:31.000393 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 
16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:30Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:31 crc kubenswrapper[4675]: I0320 16:03:31.017331 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:31Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:31 crc kubenswrapper[4675]: I0320 16:03:31.673332 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:31 crc kubenswrapper[4675]: I0320 16:03:31.673373 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:31 crc kubenswrapper[4675]: I0320 16:03:31.673441 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:31 crc kubenswrapper[4675]: E0320 16:03:31.673603 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:31 crc kubenswrapper[4675]: E0320 16:03:31.673821 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:31 crc kubenswrapper[4675]: E0320 16:03:31.673905 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:32 crc kubenswrapper[4675]: I0320 16:03:32.673885 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:32 crc kubenswrapper[4675]: E0320 16:03:32.674226 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:32 crc kubenswrapper[4675]: I0320 16:03:32.683981 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 16:03:33 crc kubenswrapper[4675]: I0320 16:03:33.079892 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:33 crc kubenswrapper[4675]: E0320 16:03:33.080096 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:33 crc kubenswrapper[4675]: E0320 16:03:33.080207 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:49.080181052 +0000 UTC m=+149.113810629 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:33 crc kubenswrapper[4675]: I0320 16:03:33.673257 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:33 crc kubenswrapper[4675]: I0320 16:03:33.673300 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:33 crc kubenswrapper[4675]: I0320 16:03:33.673296 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:33 crc kubenswrapper[4675]: E0320 16:03:33.673420 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:33 crc kubenswrapper[4675]: E0320 16:03:33.673563 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:33 crc kubenswrapper[4675]: E0320 16:03:33.673637 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:34 crc kubenswrapper[4675]: I0320 16:03:34.673222 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:34 crc kubenswrapper[4675]: E0320 16:03:34.673439 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:35 crc kubenswrapper[4675]: I0320 16:03:35.673051 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:35 crc kubenswrapper[4675]: I0320 16:03:35.673184 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:35 crc kubenswrapper[4675]: I0320 16:03:35.673065 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:35 crc kubenswrapper[4675]: E0320 16:03:35.673225 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:35 crc kubenswrapper[4675]: E0320 16:03:35.673414 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:35 crc kubenswrapper[4675]: E0320 16:03:35.673510 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:35 crc kubenswrapper[4675]: E0320 16:03:35.779265 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:36 crc kubenswrapper[4675]: I0320 16:03:36.673625 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:36 crc kubenswrapper[4675]: E0320 16:03:36.674182 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:37 crc kubenswrapper[4675]: I0320 16:03:37.673382 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:37 crc kubenswrapper[4675]: I0320 16:03:37.673458 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:37 crc kubenswrapper[4675]: E0320 16:03:37.673523 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:37 crc kubenswrapper[4675]: E0320 16:03:37.673731 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:37 crc kubenswrapper[4675]: I0320 16:03:37.673820 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:37 crc kubenswrapper[4675]: E0320 16:03:37.674000 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:38 crc kubenswrapper[4675]: I0320 16:03:38.673046 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:38 crc kubenswrapper[4675]: E0320 16:03:38.673261 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.110689 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.110736 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.110751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.110792 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.110805 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:39Z","lastTransitionTime":"2026-03-20T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.124242 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.128053 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.128086 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.128099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.128119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.128133 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:39Z","lastTransitionTime":"2026-03-20T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.142819 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.146922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.146963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.146976 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.146993 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.147005 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:39Z","lastTransitionTime":"2026-03-20T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.167887 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.172058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.172097 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.172108 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.172124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.172136 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:39Z","lastTransitionTime":"2026-03-20T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.185141 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.188848 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.188912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.188936 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.188965 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.188989 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:39Z","lastTransitionTime":"2026-03-20T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.209032 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.209184 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.673709 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.673751 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.673888 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:39 crc kubenswrapper[4675]: I0320 16:03:39.673816 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.674164 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:39 crc kubenswrapper[4675]: E0320 16:03:39.673996 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.673442 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:40 crc kubenswrapper[4675]: E0320 16:03:40.673754 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.691497 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.709906 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.724651 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.738247 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.758757 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: E0320 16:03:40.780445 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.780359 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe5
68d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.793538 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc 
kubenswrapper[4675]: I0320 16:03:40.813264 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.833091 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.849113 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.869533 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.880198 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.895096 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.905695 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.917293 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.929747 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.945832 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 
16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:40 crc kubenswrapper[4675]: I0320 16:03:40.955571 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:40Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:41 crc kubenswrapper[4675]: I0320 16:03:41.673089 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:41 crc kubenswrapper[4675]: I0320 16:03:41.673146 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:41 crc kubenswrapper[4675]: I0320 16:03:41.673189 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:41 crc kubenswrapper[4675]: E0320 16:03:41.673324 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:41 crc kubenswrapper[4675]: E0320 16:03:41.673448 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:41 crc kubenswrapper[4675]: E0320 16:03:41.673581 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:42 crc kubenswrapper[4675]: I0320 16:03:42.673843 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:42 crc kubenswrapper[4675]: E0320 16:03:42.674034 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.592481 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.592620 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.592705 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:47.59266512 +0000 UTC m=+207.626294697 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.592808 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.592761 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.592864 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.592975 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:47.592957159 +0000 UTC m=+207.626586726 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.593080 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:47.593057241 +0000 UTC m=+207.626686858 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.673005 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.673304 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.673427 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.673487 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.673586 4675 scope.go:117] "RemoveContainer" containerID="e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.673610 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.673635 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.673806 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.693888 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:43 crc kubenswrapper[4675]: I0320 16:03:43.693970 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694053 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694072 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694085 4675 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694087 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694099 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694099 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694158 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:47.694138702 +0000 UTC m=+207.727768239 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:43 crc kubenswrapper[4675]: E0320 16:03:43.694181 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:47.694172573 +0000 UTC m=+207.727802110 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:44 crc kubenswrapper[4675]: I0320 16:03:44.674171 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:44 crc kubenswrapper[4675]: E0320 16:03:44.674347 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:45 crc kubenswrapper[4675]: I0320 16:03:45.673259 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:45 crc kubenswrapper[4675]: I0320 16:03:45.673307 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:45 crc kubenswrapper[4675]: E0320 16:03:45.673491 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:45 crc kubenswrapper[4675]: I0320 16:03:45.673961 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:45 crc kubenswrapper[4675]: E0320 16:03:45.674097 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:45 crc kubenswrapper[4675]: E0320 16:03:45.673532 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:45 crc kubenswrapper[4675]: E0320 16:03:45.782583 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:46 crc kubenswrapper[4675]: I0320 16:03:46.673168 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:46 crc kubenswrapper[4675]: E0320 16:03:46.673381 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:47 crc kubenswrapper[4675]: I0320 16:03:47.673652 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:47 crc kubenswrapper[4675]: I0320 16:03:47.673755 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:47 crc kubenswrapper[4675]: I0320 16:03:47.673679 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:47 crc kubenswrapper[4675]: E0320 16:03:47.673924 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:47 crc kubenswrapper[4675]: E0320 16:03:47.674062 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:47 crc kubenswrapper[4675]: E0320 16:03:47.674197 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:48 crc kubenswrapper[4675]: I0320 16:03:48.673701 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:48 crc kubenswrapper[4675]: E0320 16:03:48.674138 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:48 crc kubenswrapper[4675]: I0320 16:03:48.688526 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.154825 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.155174 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.155390 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.155310493 +0000 UTC m=+181.188940080 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.512591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.512649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.512660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.512678 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.512692 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:49Z","lastTransitionTime":"2026-03-20T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.540254 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:49Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.546300 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.546605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.546805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.547034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.547239 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:49Z","lastTransitionTime":"2026-03-20T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.569944 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:49Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.575728 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.575763 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.575796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.575825 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.575838 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:49Z","lastTransitionTime":"2026-03-20T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.599381 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:49Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.604718 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.604750 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.604760 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.604791 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.604804 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:49Z","lastTransitionTime":"2026-03-20T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.619170 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:49Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.622849 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.622905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.622919 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.622937 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.622949 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:49Z","lastTransitionTime":"2026-03-20T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.637005 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:49Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.637123 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.673443 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.673552 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.673615 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.673707 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:49 crc kubenswrapper[4675]: I0320 16:03:49.673560 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:49 crc kubenswrapper[4675]: E0320 16:03:49.673870 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.672897 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:50 crc kubenswrapper[4675]: E0320 16:03:50.673076 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.693655 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.711361 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec
624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.727887 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.739918 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.754888 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.769230 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.781375 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: E0320 16:03:50.784442 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.794510 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.805886 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.817083 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc 
kubenswrapper[4675]: I0320 16:03:50.832459 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc 
kubenswrapper[4675]: I0320 16:03:50.860234 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 
16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.880297 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.905797 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.922409 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.935604 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc 
kubenswrapper[4675]: I0320 16:03:50.949682 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.965256 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:50 crc kubenswrapper[4675]: I0320 16:03:50.979236 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:50Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:51 crc kubenswrapper[4675]: I0320 16:03:51.673536 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:51 crc kubenswrapper[4675]: I0320 16:03:51.673636 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:51 crc kubenswrapper[4675]: E0320 16:03:51.673711 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:51 crc kubenswrapper[4675]: E0320 16:03:51.673860 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:51 crc kubenswrapper[4675]: I0320 16:03:51.673536 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:51 crc kubenswrapper[4675]: E0320 16:03:51.674035 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.382528 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/0.log" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.382567 4675 generic.go:334] "Generic (PLEG): container finished" podID="7d530666-72d8-4520-a229-43eab240e5dd" containerID="457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d" exitCode=1 Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.382594 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvqmz" event={"ID":"7d530666-72d8-4520-a229-43eab240e5dd","Type":"ContainerDied","Data":"457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d"} Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.382945 4675 scope.go:117] "RemoveContainer" containerID="457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.412151 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.433524 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.450286 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.466931 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.480212 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.495496 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.506498 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.516951 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.528375 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.542404 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:51Z\\\",\\\"message\\\":\\\"2026-03-20T16:03:06+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37\\\\n2026-03-20T16:03:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37 to /host/opt/cni/bin/\\\\n2026-03-20T16:03:06Z [verbose] multus-daemon started\\\\n2026-03-20T16:03:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.566732 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.578005 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.589972 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc 
kubenswrapper[4675]: I0320 16:03:52.607790 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.621030 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.634393 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.650533 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.671393 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.673825 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:52 crc kubenswrapper[4675]: E0320 16:03:52.674025 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:52 crc kubenswrapper[4675]: I0320 16:03:52.687342 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:52Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.390900 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/0.log" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.390973 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvqmz" event={"ID":"7d530666-72d8-4520-a229-43eab240e5dd","Type":"ContainerStarted","Data":"e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4"} Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.412302 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360
e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.429649 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec
624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.446902 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.463204 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.484753 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.505960 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.520856 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.538333 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.553139 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.572932 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:51Z\\\",\\\"message\\\":\\\"2026-03-20T16:03:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37\\\\n2026-03-20T16:03:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37 to /host/opt/cni/bin/\\\\n2026-03-20T16:03:06Z [verbose] multus-daemon started\\\\n2026-03-20T16:03:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T16:03:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.603095 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.616438 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.633712 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.648077 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc 
kubenswrapper[4675]: I0320 16:03:53.667041 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z"
Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.673100 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.673115 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:03:53 crc kubenswrapper[4675]: E0320 16:03:53.673230 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.673217 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:03:53 crc kubenswrapper[4675]: E0320 16:03:53.673318 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:53 crc kubenswrapper[4675]: E0320 16:03:53.673443 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.682402 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.697515 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.709418 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:53 crc kubenswrapper[4675]: I0320 16:03:53.730984 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:54 crc kubenswrapper[4675]: I0320 16:03:54.672991 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:54 crc kubenswrapper[4675]: E0320 16:03:54.673127 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:54 crc kubenswrapper[4675]: I0320 16:03:54.674684 4675 scope.go:117] "RemoveContainer" containerID="e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.399962 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/2.log" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.403817 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.404232 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.435689 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.447658 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.458172 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc 
kubenswrapper[4675]: I0320 16:03:55.475244 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:51Z\\\",\\\"message\\\":\\\"2026-03-20T16:03:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37\\\\n2026-03-20T16:03:06+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37 to /host/opt/cni/bin/\\\\n2026-03-20T16:03:06Z [verbose] multus-daemon started\\\\n2026-03-20T16:03:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.491490 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.506250 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.525041 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.547201 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.560970 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc 
kubenswrapper[4675]: I0320 16:03:55.577126 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.587297 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.599122 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.614740 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.629164 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.641384 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.652469 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.663179 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.673626 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.673666 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:55 crc kubenswrapper[4675]: E0320 16:03:55.673756 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.673884 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:55 crc kubenswrapper[4675]: E0320 16:03:55.674057 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:55 crc kubenswrapper[4675]: E0320 16:03:55.674536 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.681154 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: I0320 16:03:55.698449 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:55 crc kubenswrapper[4675]: E0320 16:03:55.786215 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.412351 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/3.log" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.413295 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/2.log" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.417821 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" exitCode=1 Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.417885 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.417982 4675 scope.go:117] "RemoveContainer" containerID="e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.419132 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:03:56 crc kubenswrapper[4675]: E0320 16:03:56.419672 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.439102 4675 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.462938 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:51Z\\\",\\\"message\\\":\\\"2026-03-20T16:03:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37\\\\n2026-03-20T16:03:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37 to /host/opt/cni/bin/\\\\n2026-03-20T16:03:06Z [verbose] multus-daemon started\\\\n2026-03-20T16:03:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T16:03:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.497614 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1634d722f36770a2687adc0e860122ff77d123c675d16c9559bcda8da12db12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:28Z\\\",\\\"message\\\":\\\"\\\\nI0320 16:03:28.568414 6922 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 16:03:28.568696 6922 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"default/kubernetes\\\\\\\"}\\\\nI0320 16:03:28.568720 6922 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 3.309082ms\\\\nI0320 16:03:28.568730 6922 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-image-registry/image-registry-operator. OVN-Kubernetes controller took 0.105086991 seconds. No OVN measurement.\\\\nI0320 16:03:28.568756 6922 services_controller.go:356] Processing sync for service openshift-machine-api/cluster-autoscaler-operator for network=default\\\\nI0320 16:03:28.568064 6922 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 16:03:28.568849 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:55.551247 7226 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 16:03:55.551294 7226 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 16:03:55.551315 7226 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 16:03:55.551396 7226 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 16:03:55.551466 7226 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:55.551984 7226 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:55.552057 7226 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:55.552082 7226 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:55.552111 7226 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:55.552211 7226 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2cc
b41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.514035 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.537371 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610d
ee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85
a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.557376 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe5
68d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.571759 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc 
kubenswrapper[4675]: I0320 16:03:56.586419 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.599367 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.610961 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.621859 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.638476 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.654410 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec
624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.665495 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.673496 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:56 crc kubenswrapper[4675]: E0320 16:03:56.673708 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.676558 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.692507 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.704427 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.722305 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:56 crc kubenswrapper[4675]: I0320 16:03:56.732296 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.423788 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/3.log" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.429000 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:03:57 crc kubenswrapper[4675]: E0320 16:03:57.429220 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.452314 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.469542 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.489633 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.500889 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.514868 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.528848 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.546956 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:51Z\\\",\\\"message\\\":\\\"2026-03-20T16:03:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37\\\\n2026-03-20T16:03:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37 to /host/opt/cni/bin/\\\\n2026-03-20T16:03:06Z [verbose] multus-daemon started\\\\n2026-03-20T16:03:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T16:03:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.569257 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:55.551247 7226 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 16:03:55.551294 7226 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 16:03:55.551315 7226 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 16:03:55.551396 7226 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 16:03:55.551466 7226 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:55.551984 7226 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:55.552057 7226 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:55.552082 7226 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:55.552111 7226 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:55.552211 7226 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.583340 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.599427 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.612628 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.630379 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.644114 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.660320 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.671837 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.673000 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.673068 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:57 crc kubenswrapper[4675]: E0320 16:03:57.673155 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.673219 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:57 crc kubenswrapper[4675]: E0320 16:03:57.673271 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:57 crc kubenswrapper[4675]: E0320 16:03:57.673559 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.684435 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc 
kubenswrapper[4675]: I0320 16:03:57.705221 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.718674 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:57 crc kubenswrapper[4675]: I0320 16:03:57.730826 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:58 crc kubenswrapper[4675]: I0320 16:03:58.673725 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:03:58 crc kubenswrapper[4675]: E0320 16:03:58.673955 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.673733 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.673830 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.673877 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:59 crc kubenswrapper[4675]: E0320 16:03:59.673977 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:59 crc kubenswrapper[4675]: E0320 16:03:59.674081 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:59 crc kubenswrapper[4675]: E0320 16:03:59.674181 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.921698 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.921761 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.921816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.921841 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.921859 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:59Z","lastTransitionTime":"2026-03-20T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:59 crc kubenswrapper[4675]: E0320 16:03:59.942344 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:59Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.946909 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.946963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.946980 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.947004 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.947025 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:59Z","lastTransitionTime":"2026-03-20T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:59 crc kubenswrapper[4675]: E0320 16:03:59.967376 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:59Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.972364 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.972412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.972427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.972449 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.972469 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:59Z","lastTransitionTime":"2026-03-20T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:59 crc kubenswrapper[4675]: E0320 16:03:59.992698 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:59Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.998325 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.998376 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.998396 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.998430 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:59 crc kubenswrapper[4675]: I0320 16:03:59.998455 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:59Z","lastTransitionTime":"2026-03-20T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:04:00 crc kubenswrapper[4675]: E0320 16:04:00.019033 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.025195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.025262 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.025322 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.025353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.025374 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:04:00Z","lastTransitionTime":"2026-03-20T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:04:00 crc kubenswrapper[4675]: E0320 16:04:00.045746 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:04:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b6ea2d08-1bcd-4bd6-a8d1-fd4f2c962c77\\\",\\\"systemUUID\\\":\\\"3fb7a7bb-d55d-430f-9fb9-3c580cf224f3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: E0320 16:04:00.046115 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.673440 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:00 crc kubenswrapper[4675]: E0320 16:04:00.673563 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.703905 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9880c775-c68c-4318-baf5-f0eaf207db4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab8897192bb0c95b63345fb2e23feab1501d9139adc0265542c92f1e580705fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b2605f8fc0b1ce1eaf6f1682ba239e9f7ae068abb05e332ebe81842a28c7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d91c63778bfcb6ca204f0edb2b04f7245e85fc6d5737419a4933f5a98837ac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e910367433ae9ac135fa5e2090cf2e66c766c5f076cfb9272283712a7ffa3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93aa81c806253638d02b92756e692351d2d58610bc5e7bd5b2e2198911c7b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f75e3f83eabe1d4e9fcf3edc6a5e3759a26360e531f3ccfd1dac66282ea98c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd438668fe3f0e35192fbcfd7e4e5fdb08f48d598d9b8267754eb422155e1e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae49874e2645ce71c8c624381d42c80fb0bbcfe9f45132a1ae1f6d658a50fc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:23Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.717586 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e1f82c7-b739-4c4a-a633-26b6f2b68da8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:02:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:02:25.303070 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:02:25.303237 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:02:25.304236 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2132139842/tls.crt::/tmp/serving-cert-2132139842/tls.key\\\\\\\"\\\\nI0320 16:02:25.521474 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:02:25.525494 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:02:25.525604 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:02:25.525653 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:02:25.525666 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:02:25.532833 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:02:25.532850 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:02:25.532860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532866 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:02:25.532871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:02:25.532875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:02:25.532879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:02:25.532883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:02:25.538961 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2482856de5ec7d1479e33ca28eb429570ec
624959fe5a03d53d39eb7132cbf82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.730427 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.747205 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31c6170-8f41-4549-8a21-8b7937abbe8e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fedfca76f56b8aa5245fbdc21b0e92771e7ba361688ec92ce47ac089a92704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76862fdfe4f5a58c60262580df1ba12f2c8ed9a202f23fe569ce6ede40b31e0b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 16:01:23.008078 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:01:23.010437 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:01:23.056920 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:01:23.060796 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:45.948372 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:45.948499 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:45Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6439fcfdce7281f4f0698a14aa8e96321ec8bb53ac1531e81fbb2b6024f3fa90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a09083bcf966b7dcd58a2fc8d7bb340d2688b4ec7a04d2966db58986dd90819\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.758283 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf48f8bb-f683-4ab6-a76b-4f319e4f386f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a316761c915bad1fbf5d94d583865467aeb08590a5dd5de1373ce8a0d2ab3a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca80c80b4aba0cd5a40d5f53b1933e6b11b52616923dfe566c867c304c522055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fac5b726d3b7ffc1feb9848c055c002c61778c537f696c635debbaf4c386f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://82988df1e51fa0c58d1fc8116e9a7b4e1abd6988d6569f78d737c354910fc17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.769872 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257e0264775bf0ded5927adf84eb7454e710f244d42e2de8563fad9a46817604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ec92432d1e782753898c0dcddc51d0f4300941061c5a21211e8263235cba02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.783530 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vk6l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99feab4d-4648-4d25-acf1-c779dae4c9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da22483a17dd6570a20a87474a386237d211527e585770afcb63e65df4f7d477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sr7w7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vk6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: E0320 16:04:00.787194 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.799978 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31f7145a-b091-4511-a3e6-0c7d380dea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94
1386fdb209aafb5bddf7ce7129e7e86b89447bceb3216fbcf55e0b852e6a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5cl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpfs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.812790 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c246bea-d43d-4cc4-bcb4-c2f629ee43ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e9bc86d055250b56a649c4d16eccfba9ce4e06b1154a49845719dc8b48d5589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f8a9ddba6b44c2af6bb35b591470ba59a1c40e284280b74f3f81b16fbe17ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:01:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:01:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.832014 4675 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-tvqmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d530666-72d8-4520-a229-43eab240e5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:51Z\\\",\\\"message\\\":\\\"2026-03-20T16:03:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37\\\\n2026-03-20T16:03:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_47708be2-1cc9-4791-8c0b-bf8d8f472c37 to 
/host/opt/cni/bin/\\\\n2026-03-20T16:03:06Z [verbose] multus-daemon started\\\\n2026-03-20T16:03:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-stlwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tvqmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.858600 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467da034-edb5-4a24-a940-839cc0131c75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:55Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 16:03:55.551247 7226 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 16:03:55.551294 7226 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 16:03:55.551315 7226 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 16:03:55.551396 7226 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 16:03:55.551466 7226 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:55.551984 7226 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:55.552057 7226 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:55.552082 7226 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:55.552111 7226 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:55.552211 7226 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86206ea0578b66e01f
c636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqjgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n54g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.879169 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgxhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e7c4491-d0d1-486c-aa7e-7a439eae4f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c5cbba9fb449042fe48401360356cfc20b6d0bdbba9ff9cc9c405c13c55dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9xpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgxhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.892116 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmqsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mrjmp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc 
kubenswrapper[4675]: I0320 16:04:00.912549 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfd2377e33d279931d39c1c88031195102d2379ec074fa000988d0dca0ff1683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.925812 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.937111 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.949146 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c599c423f04911b264f5b684c03860cca5bca78cde26cb56a2776b34a26e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.962589 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1277d318-d05a-4621-af3f-d9237e553399\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07378c36908c3101c34dc22a4f7d65bdd144bca5eb0bea122b491213f22ae4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d746cc104f91028e0a3cb76c7528798d3732cae8620009c2b9a14a5e63ac2170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72608d9c610dee15aaa6a41fc4e6e28c57645027dbcdc8ce65cff8ec3289659d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1989e26c85a417fe3ddd52dc2536e8a593a43db59c55f8c59810fdf329db29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4667209a4d6d44cf709da27aad92da984a50d490c0d60492e3b9166a52cb261a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb396685535d696340875a7b94addc167708a1cd01ff76787d146c984597151d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecb2e7cbfa4e19876d28f66bbe7fbb948664462571dac2631caa7ddf18679216\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:03:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js4g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xdnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:00 crc kubenswrapper[4675]: I0320 16:04:00.972987 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57beb770-7d25-4973-bfe4-27e249cd1a54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4faff660cfc9b3b8fb0987837a55186b1ed38aa7e7dab15f35f6c2183c59600e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b2d5e0433f975b58b47743a70050d1c1fe568d375ff171b37e4fdc70203714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:03:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8pp4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:04:00Z is after 2025-08-24T17:21:41Z" Mar 20 16:04:01 crc kubenswrapper[4675]: I0320 16:04:01.673369 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:01 crc kubenswrapper[4675]: I0320 16:04:01.673462 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:01 crc kubenswrapper[4675]: I0320 16:04:01.673375 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:01 crc kubenswrapper[4675]: E0320 16:04:01.673599 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:01 crc kubenswrapper[4675]: E0320 16:04:01.673725 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:01 crc kubenswrapper[4675]: E0320 16:04:01.673895 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:02 crc kubenswrapper[4675]: I0320 16:04:02.673710 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:02 crc kubenswrapper[4675]: E0320 16:04:02.673953 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:03 crc kubenswrapper[4675]: I0320 16:04:03.673628 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:03 crc kubenswrapper[4675]: E0320 16:04:03.673789 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:03 crc kubenswrapper[4675]: I0320 16:04:03.673879 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:03 crc kubenswrapper[4675]: I0320 16:04:03.673965 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:03 crc kubenswrapper[4675]: E0320 16:04:03.674087 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:03 crc kubenswrapper[4675]: E0320 16:04:03.674131 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:04 crc kubenswrapper[4675]: I0320 16:04:04.673886 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:04 crc kubenswrapper[4675]: E0320 16:04:04.674129 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:05 crc kubenswrapper[4675]: I0320 16:04:05.673399 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:05 crc kubenswrapper[4675]: I0320 16:04:05.673473 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:05 crc kubenswrapper[4675]: E0320 16:04:05.674016 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:05 crc kubenswrapper[4675]: I0320 16:04:05.673508 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:05 crc kubenswrapper[4675]: E0320 16:04:05.674099 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:05 crc kubenswrapper[4675]: E0320 16:04:05.674229 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:05 crc kubenswrapper[4675]: E0320 16:04:05.788666 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:06 crc kubenswrapper[4675]: I0320 16:04:06.673648 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:06 crc kubenswrapper[4675]: E0320 16:04:06.673863 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:07 crc kubenswrapper[4675]: I0320 16:04:07.673638 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:07 crc kubenswrapper[4675]: I0320 16:04:07.673693 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:07 crc kubenswrapper[4675]: I0320 16:04:07.673638 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:07 crc kubenswrapper[4675]: E0320 16:04:07.673852 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:07 crc kubenswrapper[4675]: E0320 16:04:07.673974 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:07 crc kubenswrapper[4675]: E0320 16:04:07.674094 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:08 crc kubenswrapper[4675]: I0320 16:04:08.673162 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:08 crc kubenswrapper[4675]: E0320 16:04:08.673358 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:09 crc kubenswrapper[4675]: I0320 16:04:09.673327 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:09 crc kubenswrapper[4675]: E0320 16:04:09.673873 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:09 crc kubenswrapper[4675]: I0320 16:04:09.673340 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:09 crc kubenswrapper[4675]: I0320 16:04:09.673321 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:09 crc kubenswrapper[4675]: E0320 16:04:09.673969 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:09 crc kubenswrapper[4675]: E0320 16:04:09.674090 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.104596 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.104662 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.104681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.104706 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.104725 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:04:10Z","lastTransitionTime":"2026-03-20T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.189278 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr"] Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.189897 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.192007 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.192476 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.192889 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.193755 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.239548 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.239512364 podStartE2EDuration="38.239512364s" podCreationTimestamp="2026-03-20 16:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.239467563 +0000 UTC m=+170.273097100" watchObservedRunningTime="2026-03-20 16:04:10.239512364 +0000 UTC m=+170.273141951" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.239959 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=51.239942747 podStartE2EDuration="51.239942747s" podCreationTimestamp="2026-03-20 16:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.219913798 +0000 UTC m=+170.253543375" watchObservedRunningTime="2026-03-20 16:04:10.239942747 +0000 UTC m=+170.273572334" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.279282 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podStartSLOduration=109.279255063 podStartE2EDuration="1m49.279255063s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.279197151 +0000 UTC m=+170.312826688" watchObservedRunningTime="2026-03-20 16:04:10.279255063 +0000 UTC m=+170.312884610" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.279751 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.279820 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.279992 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.280012 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5vk6l" podStartSLOduration=109.279999064 podStartE2EDuration="1m49.279999064s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.265701311 +0000 UTC m=+170.299330918" watchObservedRunningTime="2026-03-20 16:04:10.279999064 +0000 UTC m=+170.313628621" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.280031 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.280137 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.309006 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.308979482 podStartE2EDuration="22.308979482s" podCreationTimestamp="2026-03-20 16:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.29299857 +0000 UTC m=+170.326628147" watchObservedRunningTime="2026-03-20 16:04:10.308979482 +0000 UTC m=+170.342609029" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.309142 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tvqmz" podStartSLOduration=109.309137176 podStartE2EDuration="1m49.309137176s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.30855967 +0000 UTC m=+170.342189257" watchObservedRunningTime="2026-03-20 16:04:10.309137176 +0000 UTC m=+170.342766733" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.374014 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rgxhl" podStartSLOduration=109.373994531 podStartE2EDuration="1m49.373994531s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.373251609 +0000 UTC m=+170.406881186" watchObservedRunningTime="2026-03-20 16:04:10.373994531 +0000 UTC m=+170.407624078" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.380693 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.380749 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.380811 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.380838 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.380863 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.380913 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.381057 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.381941 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.386255 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.402610 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50621ddb-42ee-4ded-b59f-2a679cfe5ddd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qqdfr\" (UID: \"50621ddb-42ee-4ded-b59f-2a679cfe5ddd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.475662 4675 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xdnn9" podStartSLOduration=109.475646718 podStartE2EDuration="1m49.475646718s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.47536583 +0000 UTC m=+170.508995377" watchObservedRunningTime="2026-03-20 16:04:10.475646718 +0000 UTC m=+170.509276255" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.488290 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8pp4n" podStartSLOduration=108.488269623 podStartE2EDuration="1m48.488269623s" podCreationTimestamp="2026-03-20 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.486759159 +0000 UTC m=+170.520388706" watchObservedRunningTime="2026-03-20 16:04:10.488269623 +0000 UTC m=+170.521899160" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.509004 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.531851 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.531834842 podStartE2EDuration="1m11.531834842s" podCreationTimestamp="2026-03-20 16:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.530217625 +0000 UTC m=+170.563847162" watchObservedRunningTime="2026-03-20 16:04:10.531834842 +0000 UTC m=+170.565464389" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.550116 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.55009893 podStartE2EDuration="1m30.55009893s" podCreationTimestamp="2026-03-20 16:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:10.549080301 +0000 UTC m=+170.582709838" watchObservedRunningTime="2026-03-20 16:04:10.55009893 +0000 UTC m=+170.583728467" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.673809 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:10 crc kubenswrapper[4675]: E0320 16:04:10.674597 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.716558 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 16:04:10 crc kubenswrapper[4675]: I0320 16:04:10.724021 4675 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 16:04:10 crc kubenswrapper[4675]: E0320 16:04:10.790361 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.477802 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" event={"ID":"50621ddb-42ee-4ded-b59f-2a679cfe5ddd","Type":"ContainerStarted","Data":"57c128f879e804965b020415d14f4801f6644ba7c3e97e416892dfd62b10cc39"} Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.477858 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" event={"ID":"50621ddb-42ee-4ded-b59f-2a679cfe5ddd","Type":"ContainerStarted","Data":"79cef8f558f18ab2cb630d3739ef477e83f3e436f3e026d9f5f5491a130a8003"} Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.494862 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qqdfr" podStartSLOduration=110.494815921 podStartE2EDuration="1m50.494815921s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:11.493170684 +0000 UTC m=+171.526800251" watchObservedRunningTime="2026-03-20 
16:04:11.494815921 +0000 UTC m=+171.528445498" Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.672906 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.673047 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:11 crc kubenswrapper[4675]: E0320 16:04:11.673138 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.673154 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:11 crc kubenswrapper[4675]: E0320 16:04:11.673457 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:11 crc kubenswrapper[4675]: E0320 16:04:11.673512 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:11 crc kubenswrapper[4675]: I0320 16:04:11.674016 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:04:11 crc kubenswrapper[4675]: E0320 16:04:11.674174 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:04:12 crc kubenswrapper[4675]: I0320 16:04:12.673501 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:12 crc kubenswrapper[4675]: E0320 16:04:12.673963 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:13 crc kubenswrapper[4675]: I0320 16:04:13.673134 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:13 crc kubenswrapper[4675]: I0320 16:04:13.673207 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:13 crc kubenswrapper[4675]: I0320 16:04:13.673243 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:13 crc kubenswrapper[4675]: E0320 16:04:13.673313 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:13 crc kubenswrapper[4675]: E0320 16:04:13.673456 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:13 crc kubenswrapper[4675]: E0320 16:04:13.673710 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:14 crc kubenswrapper[4675]: I0320 16:04:14.673629 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:14 crc kubenswrapper[4675]: E0320 16:04:14.673936 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:15 crc kubenswrapper[4675]: I0320 16:04:15.673384 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:15 crc kubenswrapper[4675]: I0320 16:04:15.673429 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:15 crc kubenswrapper[4675]: I0320 16:04:15.673378 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:15 crc kubenswrapper[4675]: E0320 16:04:15.673596 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:15 crc kubenswrapper[4675]: E0320 16:04:15.673821 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:15 crc kubenswrapper[4675]: E0320 16:04:15.673929 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:15 crc kubenswrapper[4675]: E0320 16:04:15.792121 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:16 crc kubenswrapper[4675]: I0320 16:04:16.672947 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:16 crc kubenswrapper[4675]: E0320 16:04:16.673178 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:17 crc kubenswrapper[4675]: I0320 16:04:17.672933 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:17 crc kubenswrapper[4675]: I0320 16:04:17.673043 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:17 crc kubenswrapper[4675]: E0320 16:04:17.673216 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:17 crc kubenswrapper[4675]: I0320 16:04:17.672956 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:17 crc kubenswrapper[4675]: E0320 16:04:17.674061 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:17 crc kubenswrapper[4675]: E0320 16:04:17.674298 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:18 crc kubenswrapper[4675]: I0320 16:04:18.673030 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:18 crc kubenswrapper[4675]: E0320 16:04:18.673211 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:19 crc kubenswrapper[4675]: I0320 16:04:19.673431 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:19 crc kubenswrapper[4675]: I0320 16:04:19.673522 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:19 crc kubenswrapper[4675]: I0320 16:04:19.673547 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:19 crc kubenswrapper[4675]: E0320 16:04:19.673637 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:19 crc kubenswrapper[4675]: E0320 16:04:19.673847 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:19 crc kubenswrapper[4675]: E0320 16:04:19.674069 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:20 crc kubenswrapper[4675]: I0320 16:04:20.673816 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:20 crc kubenswrapper[4675]: E0320 16:04:20.675466 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:20 crc kubenswrapper[4675]: E0320 16:04:20.793039 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 16:04:21 crc kubenswrapper[4675]: I0320 16:04:21.196371 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:21 crc kubenswrapper[4675]: E0320 16:04:21.196684 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:04:21 crc kubenswrapper[4675]: E0320 16:04:21.197104 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs podName:dfd7e79e-d566-4cfc-80b0-b8ff3a489837 nodeName:}" failed. No retries permitted until 2026-03-20 16:05:25.197069847 +0000 UTC m=+245.230699424 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs") pod "network-metrics-daemon-mrjmp" (UID: "dfd7e79e-d566-4cfc-80b0-b8ff3a489837") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:04:21 crc kubenswrapper[4675]: I0320 16:04:21.673704 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:21 crc kubenswrapper[4675]: I0320 16:04:21.673756 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:21 crc kubenswrapper[4675]: E0320 16:04:21.673980 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:21 crc kubenswrapper[4675]: E0320 16:04:21.674200 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:21 crc kubenswrapper[4675]: I0320 16:04:21.674291 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:21 crc kubenswrapper[4675]: E0320 16:04:21.674425 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:22 crc kubenswrapper[4675]: I0320 16:04:22.673273 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:22 crc kubenswrapper[4675]: E0320 16:04:22.673471 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:23 crc kubenswrapper[4675]: I0320 16:04:23.673195 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:23 crc kubenswrapper[4675]: I0320 16:04:23.673227 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:23 crc kubenswrapper[4675]: I0320 16:04:23.673174 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:23 crc kubenswrapper[4675]: E0320 16:04:23.673608 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:23 crc kubenswrapper[4675]: E0320 16:04:23.673687 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:23 crc kubenswrapper[4675]: E0320 16:04:23.673842 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:23 crc kubenswrapper[4675]: I0320 16:04:23.674037 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:04:23 crc kubenswrapper[4675]: E0320 16:04:23.674255 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n54g5_openshift-ovn-kubernetes(467da034-edb5-4a24-a940-839cc0131c75)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" Mar 20 16:04:24 crc kubenswrapper[4675]: I0320 16:04:24.673445 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:24 crc kubenswrapper[4675]: E0320 16:04:24.673645 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:25 crc kubenswrapper[4675]: I0320 16:04:25.672931 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:25 crc kubenswrapper[4675]: I0320 16:04:25.672987 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:25 crc kubenswrapper[4675]: I0320 16:04:25.673021 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:25 crc kubenswrapper[4675]: E0320 16:04:25.673071 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:25 crc kubenswrapper[4675]: E0320 16:04:25.673126 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:25 crc kubenswrapper[4675]: E0320 16:04:25.673207 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:25 crc kubenswrapper[4675]: E0320 16:04:25.794694 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:26 crc kubenswrapper[4675]: I0320 16:04:26.673587 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:26 crc kubenswrapper[4675]: E0320 16:04:26.674192 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:27 crc kubenswrapper[4675]: I0320 16:04:27.673653 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:27 crc kubenswrapper[4675]: I0320 16:04:27.673687 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:27 crc kubenswrapper[4675]: I0320 16:04:27.673916 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:27 crc kubenswrapper[4675]: E0320 16:04:27.674146 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:27 crc kubenswrapper[4675]: E0320 16:04:27.674348 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:27 crc kubenswrapper[4675]: E0320 16:04:27.674496 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:28 crc kubenswrapper[4675]: I0320 16:04:28.673847 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:28 crc kubenswrapper[4675]: E0320 16:04:28.674033 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:29 crc kubenswrapper[4675]: I0320 16:04:29.672899 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:29 crc kubenswrapper[4675]: I0320 16:04:29.672992 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:29 crc kubenswrapper[4675]: E0320 16:04:29.673104 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:29 crc kubenswrapper[4675]: I0320 16:04:29.673159 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:29 crc kubenswrapper[4675]: E0320 16:04:29.673339 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:29 crc kubenswrapper[4675]: E0320 16:04:29.673482 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:30 crc kubenswrapper[4675]: I0320 16:04:30.673656 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:30 crc kubenswrapper[4675]: E0320 16:04:30.675681 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:30 crc kubenswrapper[4675]: E0320 16:04:30.795837 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:31 crc kubenswrapper[4675]: I0320 16:04:31.673441 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:31 crc kubenswrapper[4675]: I0320 16:04:31.673435 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:31 crc kubenswrapper[4675]: I0320 16:04:31.674158 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:31 crc kubenswrapper[4675]: E0320 16:04:31.674515 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:31 crc kubenswrapper[4675]: E0320 16:04:31.674627 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:31 crc kubenswrapper[4675]: E0320 16:04:31.675183 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:32 crc kubenswrapper[4675]: I0320 16:04:32.673281 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:32 crc kubenswrapper[4675]: E0320 16:04:32.673857 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:33 crc kubenswrapper[4675]: I0320 16:04:33.673352 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:33 crc kubenswrapper[4675]: I0320 16:04:33.673415 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:33 crc kubenswrapper[4675]: I0320 16:04:33.673367 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:33 crc kubenswrapper[4675]: E0320 16:04:33.673488 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:33 crc kubenswrapper[4675]: E0320 16:04:33.673648 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:33 crc kubenswrapper[4675]: E0320 16:04:33.673826 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:34 crc kubenswrapper[4675]: I0320 16:04:34.672718 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:34 crc kubenswrapper[4675]: E0320 16:04:34.672942 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:35 crc kubenswrapper[4675]: I0320 16:04:35.673685 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:35 crc kubenswrapper[4675]: I0320 16:04:35.673844 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:35 crc kubenswrapper[4675]: E0320 16:04:35.673846 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:35 crc kubenswrapper[4675]: I0320 16:04:35.673873 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:35 crc kubenswrapper[4675]: E0320 16:04:35.674179 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:35 crc kubenswrapper[4675]: E0320 16:04:35.674358 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:35 crc kubenswrapper[4675]: I0320 16:04:35.674621 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:04:35 crc kubenswrapper[4675]: E0320 16:04:35.797980 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:36 crc kubenswrapper[4675]: I0320 16:04:36.563800 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/3.log" Mar 20 16:04:36 crc kubenswrapper[4675]: I0320 16:04:36.567252 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerStarted","Data":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} Mar 20 16:04:36 crc kubenswrapper[4675]: I0320 16:04:36.567740 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:04:36 crc kubenswrapper[4675]: I0320 16:04:36.595319 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podStartSLOduration=135.595301608 podStartE2EDuration="2m15.595301608s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:36.595270267 +0000 UTC m=+196.628899804" watchObservedRunningTime="2026-03-20 16:04:36.595301608 +0000 UTC m=+196.628931145" Mar 20 16:04:36 crc kubenswrapper[4675]: I0320 16:04:36.672835 4675 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:36 crc kubenswrapper[4675]: E0320 16:04:36.672989 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:36 crc kubenswrapper[4675]: I0320 16:04:36.836183 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mrjmp"] Mar 20 16:04:37 crc kubenswrapper[4675]: I0320 16:04:37.570471 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:37 crc kubenswrapper[4675]: E0320 16:04:37.570626 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:37 crc kubenswrapper[4675]: I0320 16:04:37.672960 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:37 crc kubenswrapper[4675]: I0320 16:04:37.673112 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:37 crc kubenswrapper[4675]: E0320 16:04:37.673221 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:37 crc kubenswrapper[4675]: I0320 16:04:37.673275 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:37 crc kubenswrapper[4675]: E0320 16:04:37.673353 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:37 crc kubenswrapper[4675]: E0320 16:04:37.673445 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:39 crc kubenswrapper[4675]: I0320 16:04:39.672981 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:39 crc kubenswrapper[4675]: I0320 16:04:39.673027 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:39 crc kubenswrapper[4675]: I0320 16:04:39.673032 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:39 crc kubenswrapper[4675]: I0320 16:04:39.673105 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:39 crc kubenswrapper[4675]: E0320 16:04:39.673219 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:39 crc kubenswrapper[4675]: E0320 16:04:39.673314 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mrjmp" podUID="dfd7e79e-d566-4cfc-80b0-b8ff3a489837" Mar 20 16:04:39 crc kubenswrapper[4675]: E0320 16:04:39.673447 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:39 crc kubenswrapper[4675]: E0320 16:04:39.673610 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.673570 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.674215 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.674857 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.675389 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.679728 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.680025 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.680087 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.680189 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.680328 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 16:04:41 crc kubenswrapper[4675]: I0320 16:04:41.681166 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.685573 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:47 crc kubenswrapper[4675]: E0320 16:04:47.685812 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:06:49.68574084 +0000 UTC m=+329.719370417 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.686031 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.686113 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.687401 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.696755 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.718078 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.787861 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.787985 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.791720 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:04:47 crc kubenswrapper[4675]: I0320 16:04:47.792891 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.003563 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.046622 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:04:48 crc kubenswrapper[4675]: W0320 16:04:48.222029 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-95d8a0edc03d2d8c538a7042f96a2d1999cd301bd910a39fc732763ec07301be WatchSource:0}: Error finding container 95d8a0edc03d2d8c538a7042f96a2d1999cd301bd910a39fc732763ec07301be: Status 404 returned error can't find the container with id 95d8a0edc03d2d8c538a7042f96a2d1999cd301bd910a39fc732763ec07301be
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.608032 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"33f97b3e53bc045b068b8db451af502620ddf61b738b0f6c1cfacf7711e7e38d"}
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.608084 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"95d8a0edc03d2d8c538a7042f96a2d1999cd301bd910a39fc732763ec07301be"}
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.611100 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c6667688a607f36d634d9edb417d2ca65804b28fec086cb6c7fd7dbdb20051b1"}
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.611204 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f0e9b5b482e813cbc2c5cfcbe2a038f014031a978a9d0e33c05be09c37047d6"}
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.613089 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aec4ffbf4be212f9abdc4a4a6bf52569ae3b6985cadbb5af790ca7b8a6827ab6"}
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.613158 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6edba896092e79517133f1b5dd9c4e0b1a6f5db33cfc22b41e9863f5efe53d8e"}
Mar 20 16:04:48 crc kubenswrapper[4675]: I0320 16:04:48.613409 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:04:50 crc kubenswrapper[4675]: I0320 16:04:50.996117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.041062 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ccrx9"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.041831 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ccrx9"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.042838 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds6wn"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.043465 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.047848 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.049370 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.049531 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.049671 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.049825 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.050029 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.050414 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.050569 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.050603 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.051238 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.051261 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.051296 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.052247 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.053016 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.054554 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.054991 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.060139 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-htpxv"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.060899 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.061034 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.062638 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9tmzc"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.063536 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.064195 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.064842 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.065503 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.065181 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rpvlc"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.066375 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.065874 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.067290 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.067517 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.066103 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.067924 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.066659 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.072799 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.073433 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.074082 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.074361 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.070992 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.074561 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.074782 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.077980 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.078736 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.085727 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.086510 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.086881 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.087700 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.087963 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.087998 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.088121 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.088263 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.088415 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.088447 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.088927 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089051 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089135 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089214 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089336 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089062 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089445 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089502 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.089691 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.090679 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.091467 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.092612 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vl96h"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.093437 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.093959 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.094264 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.094394 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.094561 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.095178 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.095403 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.096049 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.099056 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.099134 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.099344 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.099475 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.099499 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.101404 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.103345 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k6cqj"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.103404 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.103866 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-q9jhd"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.104335 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qdwzs"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.104379 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-q9jhd"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.104900 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.104978 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-k6cqj"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.106100 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.106889 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.107553 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.107868 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.108035 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.108446 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.108520 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.109302 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.109368 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.109310 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.109967 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110014 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110250 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110284 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110321 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110252 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110498 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110593 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110702 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110888 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.110926 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.111042 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.111045 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.111145 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.111247 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mz2tv"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.111261 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.111801 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.112717 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.112915 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.113018 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.117219 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.117485 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.117818 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.117972 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118118 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118151 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118262 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118343 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118453 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118486 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118661 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118802 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118837 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.117984 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.118667 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.119239 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.119326 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.119504 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.119680 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-cnrtx"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.119699 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.120530 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.120838 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.121224 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.122861 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gnrqz"]
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.123140 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.124264 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.124655 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.124985 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.125734 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.129529 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.130305 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.132119 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139216 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139620 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139625 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139664 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9040e8e8-dcda-4de9-b015-0cc0e947858d-serving-cert\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139689 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvrk\" (UniqueName: \"kubernetes.io/projected/a67fae4e-e87c-48bc-83e4-0bc553cf5904-kube-api-access-sdvrk\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139710 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-images\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139724 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67fae4e-e87c-48bc-83e4-0bc553cf5904-serving-cert\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139740 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139759 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv5tz\" (UniqueName: \"kubernetes.io/projected/08f5035b-6ecf-49dc-8317-d40e5675a472-kube-api-access-gv5tz\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139794 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-config\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139812 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-config\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139832 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-client\") pod \"etcd-operator-b45778765-vl96h\" (UID: 
\"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139869 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d96a004-baaa-4e15-af6b-e25b8e503958-config\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139894 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-config\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139921 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-policies\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139947 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.139971 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57069867-4c85-4262-ac54-05d5257ad81b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140011 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe3541d-1e23-45a3-9fac-823f53e92044-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140027 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbe3541d-1e23-45a3-9fac-823f53e92044-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140043 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-config\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140058 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/78129d19-d0a7-404f-96a0-4096b7d7f375-machine-approver-tls\") 
pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140073 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140092 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-service-ca\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140110 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fb2328-2088-4d1f-a731-dc276b678a94-serving-cert\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140126 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt7ff\" (UniqueName: \"kubernetes.io/projected/20519e1c-a631-42ac-8bcb-c5a18b3ac4b0-kube-api-access-qt7ff\") pod \"dns-operator-744455d44c-9tmzc\" (UID: \"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 
16:04:51.140144 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fec3b15e-e8c4-4f4b-9153-4014bbf77c86-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gcwfs\" (UID: \"fec3b15e-e8c4-4f4b-9153-4014bbf77c86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140162 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140179 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zsm\" (UniqueName: \"kubernetes.io/projected/a5fb2328-2088-4d1f-a731-dc276b678a94-kube-api-access-r7zsm\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140205 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf42be81-82f9-47c4-a968-1c048e52d4f3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140220 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140235 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddqd\" (UniqueName: \"kubernetes.io/projected/57069867-4c85-4262-ac54-05d5257ad81b-kube-api-access-tddqd\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140257 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140274 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fznd\" (UniqueName: \"kubernetes.io/projected/78129d19-d0a7-404f-96a0-4096b7d7f375-kube-api-access-4fznd\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140389 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59253b32-b908-48ed-bfb6-d3374fbcd40b-serving-cert\") pod \"controller-manager-879f6c89f-ds6wn\" 
(UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.140666 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141025 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78129d19-d0a7-404f-96a0-4096b7d7f375-config\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141054 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57069867-4c85-4262-ac54-05d5257ad81b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141073 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d80adb62-f7cc-4c49-98da-7a1167881907-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141089 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78129d19-d0a7-404f-96a0-4096b7d7f375-auth-proxy-config\") pod \"machine-approver-56656f9798-hhqgm\" (UID: 
\"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141112 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42be81-82f9-47c4-a968-1c048e52d4f3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141129 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2z5x\" (UniqueName: \"kubernetes.io/projected/fec3b15e-e8c4-4f4b-9153-4014bbf77c86-kube-api-access-m2z5x\") pod \"cluster-samples-operator-665b6dd947-gcwfs\" (UID: \"fec3b15e-e8c4-4f4b-9153-4014bbf77c86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141145 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b98a434-f0d3-415c-adbb-0ff614dac3e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141162 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" 
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141177 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5b2c\" (UniqueName: \"kubernetes.io/projected/9040e8e8-dcda-4de9-b015-0cc0e947858d-kube-api-access-m5b2c\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141278 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08f5035b-6ecf-49dc-8317-d40e5675a472-trusted-ca\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141297 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08f5035b-6ecf-49dc-8317-d40e5675a472-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141314 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141367 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141398 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2xtt\" (UniqueName: \"kubernetes.io/projected/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-kube-api-access-v2xtt\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141441 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141459 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08f5035b-6ecf-49dc-8317-d40e5675a472-metrics-tls\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.141503 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143310 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 
16:04:51.143661 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143695 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqqw8\" (UniqueName: \"kubernetes.io/projected/1b98a434-f0d3-415c-adbb-0ff614dac3e4-kube-api-access-rqqw8\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143719 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-dir\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143740 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvn8q\" (UniqueName: \"kubernetes.io/projected/d80adb62-f7cc-4c49-98da-7a1167881907-kube-api-access-fvn8q\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143811 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/20519e1c-a631-42ac-8bcb-c5a18b3ac4b0-metrics-tls\") pod \"dns-operator-744455d44c-9tmzc\" (UID: \"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143838 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bpl\" (UniqueName: \"kubernetes.io/projected/6d96a004-baaa-4e15-af6b-e25b8e503958-kube-api-access-f7bpl\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143872 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d96a004-baaa-4e15-af6b-e25b8e503958-serving-cert\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.143955 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-client-ca\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144015 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe3541d-1e23-45a3-9fac-823f53e92044-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144049 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984fz\" (UniqueName: \"kubernetes.io/projected/59253b32-b908-48ed-bfb6-d3374fbcd40b-kube-api-access-984fz\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144083 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b98a434-f0d3-415c-adbb-0ff614dac3e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144099 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144126 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 
16:04:51.144146 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-service-ca-bundle\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144166 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d80adb62-f7cc-4c49-98da-7a1167881907-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144186 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80adb62-f7cc-4c49-98da-7a1167881907-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144200 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-ca\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144224 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144243 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pnnc\" (UniqueName: \"kubernetes.io/projected/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-kube-api-access-9pnnc\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144256 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-config\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144282 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42be81-82f9-47c4-a968-1c048e52d4f3-config\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144298 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7h8\" (UniqueName: \"kubernetes.io/projected/3074e872-f732-42c6-b7c3-6a88e0f5b81c-kube-api-access-9r7h8\") pod \"downloads-7954f5f757-ccrx9\" (UID: \"3074e872-f732-42c6-b7c3-6a88e0f5b81c\") " 
pod="openshift-console/downloads-7954f5f757-ccrx9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144311 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a67fae4e-e87c-48bc-83e4-0bc553cf5904-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144327 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144367 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d96a004-baaa-4e15-af6b-e25b8e503958-trusted-ca\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144904 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.144929 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.145024 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 
16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.145345 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.145411 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.145629 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.146043 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.146246 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.147269 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.148916 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.149207 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.149467 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.151703 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.153047 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.154216 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.154354 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.155262 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vh65f"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.155720 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dxwrc"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.156139 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.156263 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567044-wld22"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.156349 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.156653 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-wld22" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.156671 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.158832 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gx27v"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.159370 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds6wn"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.159446 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.159753 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ccrx9"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.160031 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.165076 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f8nq8"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.165746 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.166066 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.166317 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.167166 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.168858 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.171794 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.172588 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rpvlc"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.175595 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.178583 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.180660 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.183246 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cnrtx"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.184650 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vl96h"] Mar 20 
16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.184673 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.188718 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.192120 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.192148 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.207687 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qdwzs"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.209796 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.210461 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.225833 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.227877 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.228458 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.239905 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-9tmzc"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.241397 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mz2tv"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.243333 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-b5d8k"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.243930 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.244026 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-b5d8k" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.244683 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qn4j5"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.244918 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvrk\" (UniqueName: \"kubernetes.io/projected/a67fae4e-e87c-48bc-83e4-0bc553cf5904-kube-api-access-sdvrk\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.244953 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67fae4e-e87c-48bc-83e4-0bc553cf5904-serving-cert\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.244979 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vd8vv\" (UniqueName: \"kubernetes.io/projected/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-kube-api-access-vd8vv\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245006 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-images\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245027 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245047 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz6lt\" (UniqueName: \"kubernetes.io/projected/c311c63c-0f7e-4435-a2e3-fbc85a59594e-kube-api-access-lz6lt\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245065 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d96c\" (UniqueName: \"kubernetes.io/projected/14695b26-4567-40f0-a892-25172bd0fb0a-kube-api-access-5d96c\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245086 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-client\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245101 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-audit-policies\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245116 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d96a004-baaa-4e15-af6b-e25b8e503958-config\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245132 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7wr\" (UniqueName: \"kubernetes.io/projected/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-kube-api-access-xm7wr\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245149 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245165 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57069867-4c85-4262-ac54-05d5257ad81b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245180 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe3541d-1e23-45a3-9fac-823f53e92044-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245195 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fec3b15e-e8c4-4f4b-9153-4014bbf77c86-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gcwfs\" (UID: \"fec3b15e-e8c4-4f4b-9153-4014bbf77c86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245210 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-tmpfs\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 
20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245225 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-webhook-cert\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245240 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245253 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zdf\" (UniqueName: \"kubernetes.io/projected/ffe62543-215b-47d8-9e48-de4466ce84f2-kube-api-access-k7zdf\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245268 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245284 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-encryption-config\") pod 
\"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245299 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-profile-collector-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245320 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf42be81-82f9-47c4-a968-1c048e52d4f3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245337 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245353 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvb9p\" (UniqueName: \"kubernetes.io/projected/1bebdfb7-34f5-4e90-b64e-c1442738c51d-kube-api-access-wvb9p\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:51 crc 
kubenswrapper[4675]: I0320 16:04:51.245368 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245385 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78129d19-d0a7-404f-96a0-4096b7d7f375-config\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245400 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-serving-cert\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245414 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4p4\" (UniqueName: \"kubernetes.io/projected/9ec164c8-422d-443c-aed4-b30304c06694-kube-api-access-cr4p4\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245430 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2z5x\" (UniqueName: \"kubernetes.io/projected/fec3b15e-e8c4-4f4b-9153-4014bbf77c86-kube-api-access-m2z5x\") pod \"cluster-samples-operator-665b6dd947-gcwfs\" (UID: 
\"fec3b15e-e8c4-4f4b-9153-4014bbf77c86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245447 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b98a434-f0d3-415c-adbb-0ff614dac3e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245463 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5b2c\" (UniqueName: \"kubernetes.io/projected/9040e8e8-dcda-4de9-b015-0cc0e947858d-kube-api-access-m5b2c\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245481 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08f5035b-6ecf-49dc-8317-d40e5675a472-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245496 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245513 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245529 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08f5035b-6ecf-49dc-8317-d40e5675a472-trusted-ca\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245545 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245561 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvt6f\" (UniqueName: \"kubernetes.io/projected/ba4d60b3-764c-4378-ba52-23f712ab9eb0-kube-api-access-zvt6f\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245575 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-key\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" 
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245590 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-serving-cert\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245608 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqqw8\" (UniqueName: \"kubernetes.io/projected/1b98a434-f0d3-415c-adbb-0ff614dac3e4-kube-api-access-rqqw8\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245623 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bpl\" (UniqueName: \"kubernetes.io/projected/6d96a004-baaa-4e15-af6b-e25b8e503958-kube-api-access-f7bpl\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245640 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d96a004-baaa-4e15-af6b-e25b8e503958-serving-cert\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245656 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bbe3541d-1e23-45a3-9fac-823f53e92044-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245671 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b98a434-f0d3-415c-adbb-0ff614dac3e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245689 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245731 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-service-ca-bundle\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245748 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-cabundle\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245780 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mghc\" (UniqueName: \"kubernetes.io/projected/45ca71c6-ab6e-4f92-ba2f-88096793d64b-kube-api-access-9mghc\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245798 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245812 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245831 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42be81-82f9-47c4-a968-1c048e52d4f3-config\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245847 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r7h8\" (UniqueName: \"kubernetes.io/projected/3074e872-f732-42c6-b7c3-6a88e0f5b81c-kube-api-access-9r7h8\") pod \"downloads-7954f5f757-ccrx9\" (UID: \"3074e872-f732-42c6-b7c3-6a88e0f5b81c\") " pod="openshift-console/downloads-7954f5f757-ccrx9"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245862 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d96a004-baaa-4e15-af6b-e25b8e503958-trusted-ca\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245878 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkrx\" (UniqueName: \"kubernetes.io/projected/d8efdad2-e2fa-4003-bd03-117a399b9df0-kube-api-access-rvkrx\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245898 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245914 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0751b5-dda7-4346-bb7f-927d886a955b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245932 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245947 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-config\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245965 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv5tz\" (UniqueName: \"kubernetes.io/projected/08f5035b-6ecf-49dc-8317-d40e5675a472-kube-api-access-gv5tz\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245981 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-metrics-certs\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245998 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-config\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246021 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/585603c7-dfdf-4343-a32d-500c6868137e-images\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246037 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-config\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246060 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjw8b\" (UniqueName: \"kubernetes.io/projected/585603c7-dfdf-4343-a32d-500c6868137e-kube-api-access-wjw8b\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246083 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwfd\" (UniqueName: \"kubernetes.io/projected/5e0751b5-dda7-4346-bb7f-927d886a955b-kube-api-access-sxwfd\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246098 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-stats-auth\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246116 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-config\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246130 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-policies\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246147 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246162 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-service-ca\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246177 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbe3541d-1e23-45a3-9fac-823f53e92044-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246194 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-config\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246208 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/78129d19-d0a7-404f-96a0-4096b7d7f375-machine-approver-tls\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246223 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fb2328-2088-4d1f-a731-dc276b678a94-serving-cert\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246239 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt7ff\" (UniqueName: \"kubernetes.io/projected/20519e1c-a631-42ac-8bcb-c5a18b3ac4b0-kube-api-access-qt7ff\") pod \"dns-operator-744455d44c-9tmzc\" (UID: \"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246254 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-default-certificate\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246272 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7zsm\" (UniqueName: \"kubernetes.io/projected/a5fb2328-2088-4d1f-a731-dc276b678a94-kube-api-access-r7zsm\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246288 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddqd\" (UniqueName: \"kubernetes.io/projected/57069867-4c85-4262-ac54-05d5257ad81b-kube-api-access-tddqd\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246304 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-etcd-client\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246320 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcvk\" (UniqueName: \"kubernetes.io/projected/d0bf4e3e-d741-4ff3-a0cc-6280d16cd533-kube-api-access-fbcvk\") pod \"migrator-59844c95c7-49qhm\" (UID: \"d0bf4e3e-d741-4ff3-a0cc-6280d16cd533\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246336 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba4d60b3-764c-4378-ba52-23f712ab9eb0-service-ca-bundle\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246351 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fznd\" (UniqueName: \"kubernetes.io/projected/78129d19-d0a7-404f-96a0-4096b7d7f375-kube-api-access-4fznd\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246366 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/585603c7-dfdf-4343-a32d-500c6868137e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246382 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-client-ca\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246397 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0751b5-dda7-4346-bb7f-927d886a955b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246413 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57069867-4c85-4262-ac54-05d5257ad81b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246428 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d80adb62-f7cc-4c49-98da-7a1167881907-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246445 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14695b26-4567-40f0-a892-25172bd0fb0a-audit-dir\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246460 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8efdad2-e2fa-4003-bd03-117a399b9df0-proxy-tls\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246477 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59253b32-b908-48ed-bfb6-d3374fbcd40b-serving-cert\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246492 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78129d19-d0a7-404f-96a0-4096b7d7f375-auth-proxy-config\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246509 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbhf\" (UniqueName: \"kubernetes.io/projected/9ca3b7f0-d36c-487e-938c-da2d8781061a-kube-api-access-xdbhf\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246525 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/c6d2332a-bd88-45d7-8645-63778001dd65-kube-api-access-2xv82\") pod \"auto-csr-approver-29567044-wld22\" (UID: \"c6d2332a-bd88-45d7-8645-63778001dd65\") " pod="openshift-infra/auto-csr-approver-29567044-wld22"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246541 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42be81-82f9-47c4-a968-1c048e52d4f3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246556 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246572 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8efdad2-e2fa-4003-bd03-117a399b9df0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246592 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2xtt\" (UniqueName: \"kubernetes.io/projected/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-kube-api-access-v2xtt\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246607 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/585603c7-dfdf-4343-a32d-500c6868137e-proxy-tls\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246624 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246639 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08f5035b-6ecf-49dc-8317-d40e5675a472-metrics-tls\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246656 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246673 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvn8q\" (UniqueName: \"kubernetes.io/projected/d80adb62-f7cc-4c49-98da-7a1167881907-kube-api-access-fvn8q\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246689 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-dir\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246706 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20519e1c-a631-42ac-8bcb-c5a18b3ac4b0-metrics-tls\") pod \"dns-operator-744455d44c-9tmzc\" (UID: \"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246723 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-apiservice-cert\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246738 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-client-ca\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246870 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgpls\" (UniqueName: \"kubernetes.io/projected/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-kube-api-access-qgpls\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246889 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246905 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-srv-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246924 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-984fz\" (UniqueName: \"kubernetes.io/projected/59253b32-b908-48ed-bfb6-d3374fbcd40b-kube-api-access-984fz\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246940 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: \"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246955 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ec164c8-422d-443c-aed4-b30304c06694-metrics-tls\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246972 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d80adb62-f7cc-4c49-98da-7a1167881907-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.246992 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-ca\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247010 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80adb62-f7cc-4c49-98da-7a1167881907-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247027 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247044 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pnnc\" (UniqueName: \"kubernetes.io/projected/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-kube-api-access-9pnnc\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247059 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-config\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247077 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a67fae4e-e87c-48bc-83e4-0bc553cf5904-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247094 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247118 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247135 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247149 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ec164c8-422d-443c-aed4-b30304c06694-config-volume\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247167 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247184 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9040e8e8-dcda-4de9-b015-0cc0e947858d-serving-cert\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247199 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-srv-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247214 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksnnh\" (UniqueName: \"kubernetes.io/projected/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-kube-api-access-ksnnh\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: \"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.247829 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57069867-4c85-4262-ac54-05d5257ad81b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.248413 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-images\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.248628 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-service-ca-bundle\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.248864 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe3541d-1e23-45a3-9fac-823f53e92044-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.249418 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-client-ca\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.249579 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-config\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.250250 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-config\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.252305 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.252457 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-policies\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.252458 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.253526 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59253b32-b908-48ed-bfb6-d3374fbcd40b-serving-cert\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.253913 4675 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08f5035b-6ecf-49dc-8317-d40e5675a472-trusted-ca\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.255146 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67fae4e-e87c-48bc-83e4-0bc553cf5904-serving-cert\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.255363 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fec3b15e-e8c4-4f4b-9153-4014bbf77c86-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gcwfs\" (UID: \"fec3b15e-e8c4-4f4b-9153-4014bbf77c86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.255565 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.245815 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.256563 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57069867-4c85-4262-ac54-05d5257ad81b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.257530 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d80adb62-f7cc-4c49-98da-7a1167881907-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.257832 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fb2328-2088-4d1f-a731-dc276b678a94-config\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.258133 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a67fae4e-e87c-48bc-83e4-0bc553cf5904-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.259425 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bf42be81-82f9-47c4-a968-1c048e52d4f3-config\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.259480 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.260172 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b98a434-f0d3-415c-adbb-0ff614dac3e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.260404 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.260637 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.261412 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-dir\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.261532 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-ca\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.261844 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.262211 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78129d19-d0a7-404f-96a0-4096b7d7f375-config\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.262507 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78129d19-d0a7-404f-96a0-4096b7d7f375-auth-proxy-config\") pod \"machine-approver-56656f9798-hhqgm\" (UID: 
\"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.262571 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80adb62-f7cc-4c49-98da-7a1167881907-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.263224 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-config\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.263603 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08f5035b-6ecf-49dc-8317-d40e5675a472-metrics-tls\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.263676 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-wld22"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.263708 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.264822 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.265354 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.265461 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-service-ca\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.266676 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.266829 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.268689 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42be81-82f9-47c4-a968-1c048e52d4f3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.268841 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5fb2328-2088-4d1f-a731-dc276b678a94-serving-cert\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.269174 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.269380 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b98a434-f0d3-415c-adbb-0ff614dac3e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.270394 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-htpxv"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.271205 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-config\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.271503 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9040e8e8-dcda-4de9-b015-0cc0e947858d-etcd-client\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.272022 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20519e1c-a631-42ac-8bcb-c5a18b3ac4b0-metrics-tls\") pod \"dns-operator-744455d44c-9tmzc\" (UID: \"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.272025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe3541d-1e23-45a3-9fac-823f53e92044-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.273246 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.274060 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/78129d19-d0a7-404f-96a0-4096b7d7f375-machine-approver-tls\") pod 
\"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.274468 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.275475 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.276134 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vh65f"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.276323 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9040e8e8-dcda-4de9-b015-0cc0e947858d-serving-cert\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.276333 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.277532 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.278947 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gnrqz"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.283874 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.283928 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.285458 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.291528 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dxwrc"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.293344 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.294342 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.297337 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k6cqj"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.298752 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng"] 
Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.300123 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f8nq8"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.301468 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.302890 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gx27v"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.304021 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.304154 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.305458 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qn4j5"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.306861 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-699d9"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.307479 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-699d9" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.308183 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-699d9"] Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.323587 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.343419 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347677 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-etcd-client\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347710 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcvk\" (UniqueName: \"kubernetes.io/projected/d0bf4e3e-d741-4ff3-a0cc-6280d16cd533-kube-api-access-fbcvk\") pod \"migrator-59844c95c7-49qhm\" (UID: \"d0bf4e3e-d741-4ff3-a0cc-6280d16cd533\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347745 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0751b5-dda7-4346-bb7f-927d886a955b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347809 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba4d60b3-764c-4378-ba52-23f712ab9eb0-service-ca-bundle\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347843 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/585603c7-dfdf-4343-a32d-500c6868137e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347864 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-client-ca\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347905 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14695b26-4567-40f0-a892-25172bd0fb0a-audit-dir\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347926 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8efdad2-e2fa-4003-bd03-117a399b9df0-proxy-tls\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347950 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbhf\" (UniqueName: \"kubernetes.io/projected/9ca3b7f0-d36c-487e-938c-da2d8781061a-kube-api-access-xdbhf\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347969 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/c6d2332a-bd88-45d7-8645-63778001dd65-kube-api-access-2xv82\") pod \"auto-csr-approver-29567044-wld22\" (UID: \"c6d2332a-bd88-45d7-8645-63778001dd65\") " pod="openshift-infra/auto-csr-approver-29567044-wld22" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347991 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8efdad2-e2fa-4003-bd03-117a399b9df0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.347993 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14695b26-4567-40f0-a892-25172bd0fb0a-audit-dir\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348056 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/585603c7-dfdf-4343-a32d-500c6868137e-proxy-tls\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348094 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-apiservice-cert\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348117 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgpls\" (UniqueName: \"kubernetes.io/projected/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-kube-api-access-qgpls\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348139 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348159 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ec164c8-422d-443c-aed4-b30304c06694-metrics-tls\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348180 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-srv-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348211 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: \"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348247 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348268 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-srv-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348291 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:51 crc 
kubenswrapper[4675]: I0320 16:04:51.348312 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ec164c8-422d-443c-aed4-b30304c06694-config-volume\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348333 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksnnh\" (UniqueName: \"kubernetes.io/projected/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-kube-api-access-ksnnh\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: \"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348362 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8vv\" (UniqueName: \"kubernetes.io/projected/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-kube-api-access-vd8vv\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348384 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348406 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz6lt\" (UniqueName: \"kubernetes.io/projected/c311c63c-0f7e-4435-a2e3-fbc85a59594e-kube-api-access-lz6lt\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: 
\"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348430 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-audit-policies\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348449 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d96c\" (UniqueName: \"kubernetes.io/projected/14695b26-4567-40f0-a892-25172bd0fb0a-kube-api-access-5d96c\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348476 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7wr\" (UniqueName: \"kubernetes.io/projected/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-kube-api-access-xm7wr\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348497 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-tmpfs\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348517 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zdf\" (UniqueName: 
\"kubernetes.io/projected/ffe62543-215b-47d8-9e48-de4466ce84f2-kube-api-access-k7zdf\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348536 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-webhook-cert\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348556 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348574 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-profile-collector-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348596 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-encryption-config\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348633 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wvb9p\" (UniqueName: \"kubernetes.io/projected/1bebdfb7-34f5-4e90-b64e-c1442738c51d-kube-api-access-wvb9p\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348664 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-serving-cert\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348686 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4p4\" (UniqueName: \"kubernetes.io/projected/9ec164c8-422d-443c-aed4-b30304c06694-kube-api-access-cr4p4\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348741 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348779 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvt6f\" (UniqueName: \"kubernetes.io/projected/ba4d60b3-764c-4378-ba52-23f712ab9eb0-kube-api-access-zvt6f\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc 
kubenswrapper[4675]: I0320 16:04:51.348802 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-key\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.348824 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-serving-cert\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349101 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba4d60b3-764c-4378-ba52-23f712ab9eb0-service-ca-bundle\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349410 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-cabundle\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349440 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mghc\" (UniqueName: \"kubernetes.io/projected/45ca71c6-ab6e-4f92-ba2f-88096793d64b-kube-api-access-9mghc\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349443 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/585603c7-dfdf-4343-a32d-500c6868137e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349466 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349536 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkrx\" (UniqueName: \"kubernetes.io/projected/d8efdad2-e2fa-4003-bd03-117a399b9df0-kube-api-access-rvkrx\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349575 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349600 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-tmpfs\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349613 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0751b5-dda7-4346-bb7f-927d886a955b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349662 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-config\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349693 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-metrics-certs\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349716 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/585603c7-dfdf-4343-a32d-500c6868137e-images\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc 
kubenswrapper[4675]: I0320 16:04:51.349736 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8efdad2-e2fa-4003-bd03-117a399b9df0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349743 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjw8b\" (UniqueName: \"kubernetes.io/projected/585603c7-dfdf-4343-a32d-500c6868137e-kube-api-access-wjw8b\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349810 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-stats-auth\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349831 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwfd\" (UniqueName: \"kubernetes.io/projected/5e0751b5-dda7-4346-bb7f-927d886a955b-kube-api-access-sxwfd\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.349860 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-default-certificate\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.350092 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-audit-policies\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.350326 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.350326 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14695b26-4567-40f0-a892-25172bd0fb0a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.350863 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-etcd-client\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.351351 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-encryption-config\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.352482 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14695b26-4567-40f0-a892-25172bd0fb0a-serving-cert\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.353077 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-stats-auth\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.353656 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-metrics-certs\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.354376 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ba4d60b3-764c-4378-ba52-23f712ab9eb0-default-certificate\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.355363 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6d96a004-baaa-4e15-af6b-e25b8e503958-serving-cert\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.363758 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.368850 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d96a004-baaa-4e15-af6b-e25b8e503958-config\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.388491 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.390893 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d96a004-baaa-4e15-af6b-e25b8e503958-trusted-ca\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.404008 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.435326 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.443751 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.463821 4675 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.484582 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.504842 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.523876 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.544250 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.563877 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.597741 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.603595 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.623684 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.643719 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.652778 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0751b5-dda7-4346-bb7f-927d886a955b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: 
\"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.664473 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.684000 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.704345 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.709691 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0751b5-dda7-4346-bb7f-927d886a955b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.724917 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.744054 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.764668 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.784336 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.804570 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.810434 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/585603c7-dfdf-4343-a32d-500c6868137e-images\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.824270 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.844848 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.852351 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/585603c7-dfdf-4343-a32d-500c6868137e-proxy-tls\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.864593 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.874553 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-serving-cert\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: 
\"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.884277 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.905344 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.914866 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8efdad2-e2fa-4003-bd03-117a399b9df0-proxy-tls\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.924435 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.929485 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-client-ca\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.944898 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.951448 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-config\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: 
\"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.965043 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 16:04:51 crc kubenswrapper[4675]: I0320 16:04:51.985567 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.005077 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.025570 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.033729 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.051890 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.060659 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.063824 4675 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.084414 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.105315 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.131609 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.142636 4675 request.go:700] Waited for 1.003222457s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-dockercfg-5nsgg&limit=500&resourceVersion=0 Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.144686 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.164504 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.183629 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.205923 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.224810 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.244517 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"service-ca" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.264958 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.284164 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.304136 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.326567 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.345064 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349494 4675 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349493 4675 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349675 4675 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349715 4675 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349713 4675 
configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349798 4675 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349855 4675 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349860 4675 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349894 4675 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.349693 4675 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.350056 4675 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352334 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ec164c8-422d-443c-aed4-b30304c06694-config-volume podName:9ec164c8-422d-443c-aed4-b30304c06694 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.852290294 +0000 UTC m=+212.885919921 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/9ec164c8-422d-443c-aed4-b30304c06694-config-volume") pod "dns-default-f8nq8" (UID: "9ec164c8-422d-443c-aed4-b30304c06694") : failed to sync configmap cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352401 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-webhook-certs podName:bbaf20e3-9148-4012-aa70-0c8ba76e02c2 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.852382347 +0000 UTC m=+212.886011924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-webhook-certs") pod "multus-admission-controller-857f4d67dd-dxwrc" (UID: "bbaf20e3-9148-4012-aa70-0c8ba76e02c2") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352440 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-profile-collector-cert podName:45ca71c6-ab6e-4f92-ba2f-88096793d64b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.852424198 +0000 UTC m=+212.886053885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-profile-collector-cert") pod "olm-operator-6b444d44fb-qhnk6" (UID: "45ca71c6-ab6e-4f92-ba2f-88096793d64b") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352481 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume podName:9ca3b7f0-d36c-487e-938c-da2d8781061a nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:52.852466379 +0000 UTC m=+212.886095946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume") pod "collect-profiles-29567040-fncng" (UID: "9ca3b7f0-d36c-487e-938c-da2d8781061a") : failed to sync configmap cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352524 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-cabundle podName:ffe62543-215b-47d8-9e48-de4466ce84f2 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.8525092 +0000 UTC m=+212.886138777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-cabundle") pod "service-ca-9c57cc56f-gx27v" (UID: "ffe62543-215b-47d8-9e48-de4466ce84f2") : failed to sync configmap cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352555 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec164c8-422d-443c-aed4-b30304c06694-metrics-tls podName:9ec164c8-422d-443c-aed4-b30304c06694 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.852540541 +0000 UTC m=+212.886170168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ec164c8-422d-443c-aed4-b30304c06694-metrics-tls") pod "dns-default-f8nq8" (UID: "9ec164c8-422d-443c-aed4-b30304c06694") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352652 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-key podName:ffe62543-215b-47d8-9e48-de4466ce84f2 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.852629654 +0000 UTC m=+212.886259231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-key") pod "service-ca-9c57cc56f-gx27v" (UID: "ffe62543-215b-47d8-9e48-de4466ce84f2") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352735 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-profile-collector-cert podName:1bebdfb7-34f5-4e90-b64e-c1442738c51d nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.852714816 +0000 UTC m=+212.886344383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-profile-collector-cert") pod "catalog-operator-68c6474976-knk82" (UID: "1bebdfb7-34f5-4e90-b64e-c1442738c51d") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.352860 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-package-server-manager-serving-cert podName:29b966ff-c9bd-42b4-bf25-6f942fc2bb4d nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:52.852754207 +0000 UTC m=+212.886383784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-dd98v" (UID: "29b966ff-c9bd-42b4-bf25-6f942fc2bb4d") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.353042 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-srv-cert podName:1bebdfb7-34f5-4e90-b64e-c1442738c51d nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.853023145 +0000 UTC m=+212.886652712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-srv-cert") pod "catalog-operator-68c6474976-knk82" (UID: "1bebdfb7-34f5-4e90-b64e-c1442738c51d") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.353377 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume podName:9ca3b7f0-d36c-487e-938c-da2d8781061a nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.853055616 +0000 UTC m=+212.886685183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume") pod "collect-profiles-29567040-fncng" (UID: "9ca3b7f0-d36c-487e-938c-da2d8781061a") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.355044 4675 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: E0320 16:04:52.355231 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-srv-cert podName:45ca71c6-ab6e-4f92-ba2f-88096793d64b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:52.855199458 +0000 UTC m=+212.888829075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-srv-cert") pod "olm-operator-6b444d44fb-qhnk6" (UID: "45ca71c6-ab6e-4f92-ba2f-88096793d64b") : failed to sync secret cache: timed out waiting for the condition Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.359518 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-webhook-cert\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.360437 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-apiservice-cert\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" 
Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.365489 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.384670 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.405082 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.424006 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.443745 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.465184 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.483655 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.504124 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.524576 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.544735 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" 
Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.564280 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.584117 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.604873 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.625227 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.644100 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.664205 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.684968 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.704186 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.724366 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.744735 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.765197 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.784200 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.803793 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.825333 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.844573 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.865246 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883127 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ec164c8-422d-443c-aed4-b30304c06694-metrics-tls\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883180 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-srv-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883229 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: 
\"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883282 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883317 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-srv-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883349 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883380 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ec164c8-422d-443c-aed4-b30304c06694-config-volume\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.883515 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.884271 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ec164c8-422d-443c-aed4-b30304c06694-config-volume\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.884509 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.884566 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-key\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.884625 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-cabundle\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.884674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.884957 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.886093 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.887096 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-cabundle\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.887545 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: \"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.888738 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ffe62543-215b-47d8-9e48-de4466ce84f2-signing-key\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.889093 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-profile-collector-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.890820 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.892450 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.892961 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.893994 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/45ca71c6-ab6e-4f92-ba2f-88096793d64b-srv-cert\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.897243 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bebdfb7-34f5-4e90-b64e-c1442738c51d-srv-cert\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.897927 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ec164c8-422d-443c-aed4-b30304c06694-metrics-tls\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.945355 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.964587 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 16:04:52 crc kubenswrapper[4675]: I0320 16:04:52.983873 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.037050 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvrk\" (UniqueName: \"kubernetes.io/projected/a67fae4e-e87c-48bc-83e4-0bc553cf5904-kube-api-access-sdvrk\") pod \"openshift-config-operator-7777fb866f-wgwkt\" (UID: \"a67fae4e-e87c-48bc-83e4-0bc553cf5904\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.052814 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fznd\" (UniqueName: \"kubernetes.io/projected/78129d19-d0a7-404f-96a0-4096b7d7f375-kube-api-access-4fznd\") pod \"machine-approver-56656f9798-hhqgm\" (UID: \"78129d19-d0a7-404f-96a0-4096b7d7f375\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.069743 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqqw8\" (UniqueName: \"kubernetes.io/projected/1b98a434-f0d3-415c-adbb-0ff614dac3e4-kube-api-access-rqqw8\") pod \"openshift-controller-manager-operator-756b6f6bc6-gd5j8\" (UID: \"1b98a434-f0d3-415c-adbb-0ff614dac3e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.092489 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d80adb62-f7cc-4c49-98da-7a1167881907-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.104094 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2xtt\" (UniqueName: \"kubernetes.io/projected/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-kube-api-access-v2xtt\") pod \"oauth-openshift-558db77b4-rpvlc\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.104237 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.112742 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.125867 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-984fz\" (UniqueName: \"kubernetes.io/projected/59253b32-b908-48ed-bfb6-d3374fbcd40b-kube-api-access-984fz\") pod \"controller-manager-879f6c89f-ds6wn\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:53 crc kubenswrapper[4675]: W0320 16:04:53.136322 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78129d19_d0a7_404f_96a0_4096b7d7f375.slice/crio-0cca714ed6a7ecf21a306bf71732a7ab80a976af826e9a4a82e47af84b3ff1b5 WatchSource:0}: Error finding container 0cca714ed6a7ecf21a306bf71732a7ab80a976af826e9a4a82e47af84b3ff1b5: Status 404 returned error can't find the container with id 0cca714ed6a7ecf21a306bf71732a7ab80a976af826e9a4a82e47af84b3ff1b5 Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.142497 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1e1fc22-8efc-4ca9-a3c4-4736385a1138-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-649kg\" (UID: \"d1e1fc22-8efc-4ca9-a3c4-4736385a1138\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.144953 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.162578 4675 request.go:700] Waited for 1.905801849s due to 
client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.187264 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pnnc\" (UniqueName: \"kubernetes.io/projected/ceebe5f6-3cee-41d3-ab16-9d562cff84f8-kube-api-access-9pnnc\") pod \"machine-api-operator-5694c8668f-wzl6m\" (UID: \"ceebe5f6-3cee-41d3-ab16-9d562cff84f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.187445 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.191345 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.212446 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7h8\" (UniqueName: \"kubernetes.io/projected/3074e872-f732-42c6-b7c3-6a88e0f5b81c-kube-api-access-9r7h8\") pod \"downloads-7954f5f757-ccrx9\" (UID: \"3074e872-f732-42c6-b7c3-6a88e0f5b81c\") " pod="openshift-console/downloads-7954f5f757-ccrx9" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.222883 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bpl\" (UniqueName: \"kubernetes.io/projected/6d96a004-baaa-4e15-af6b-e25b8e503958-kube-api-access-f7bpl\") pod \"console-operator-58897d9998-k6cqj\" (UID: \"6d96a004-baaa-4e15-af6b-e25b8e503958\") " pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.242453 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf42be81-82f9-47c4-a968-1c048e52d4f3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jmncz\" (UID: \"bf42be81-82f9-47c4-a968-1c048e52d4f3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.251819 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.273280 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvn8q\" (UniqueName: \"kubernetes.io/projected/d80adb62-f7cc-4c49-98da-7a1167881907-kube-api-access-fvn8q\") pod \"cluster-image-registry-operator-dc59b4c8b-zmf9j\" (UID: \"d80adb62-f7cc-4c49-98da-7a1167881907\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.282503 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.284113 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2z5x\" (UniqueName: \"kubernetes.io/projected/fec3b15e-e8c4-4f4b-9153-4014bbf77c86-kube-api-access-m2z5x\") pod \"cluster-samples-operator-665b6dd947-gcwfs\" (UID: \"fec3b15e-e8c4-4f4b-9153-4014bbf77c86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.294120 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.304424 4675 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.304939 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv5tz\" (UniqueName: \"kubernetes.io/projected/08f5035b-6ecf-49dc-8317-d40e5675a472-kube-api-access-gv5tz\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.340277 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08f5035b-6ecf-49dc-8317-d40e5675a472-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sdxzc\" (UID: \"08f5035b-6ecf-49dc-8317-d40e5675a472\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.359870 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5b2c\" (UniqueName: \"kubernetes.io/projected/9040e8e8-dcda-4de9-b015-0cc0e947858d-kube-api-access-m5b2c\") pod \"etcd-operator-b45778765-vl96h\" (UID: \"9040e8e8-dcda-4de9-b015-0cc0e947858d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.392021 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt7ff\" (UniqueName: \"kubernetes.io/projected/20519e1c-a631-42ac-8bcb-c5a18b3ac4b0-kube-api-access-qt7ff\") pod \"dns-operator-744455d44c-9tmzc\" (UID: \"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0\") " pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 
16:04:53.401009 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddqd\" (UniqueName: \"kubernetes.io/projected/57069867-4c85-4262-ac54-05d5257ad81b-kube-api-access-tddqd\") pod \"openshift-apiserver-operator-796bbdcf4f-r6cz9\" (UID: \"57069867-4c85-4262-ac54-05d5257ad81b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.404482 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.420358 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.427737 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.436582 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.440839 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zsm\" (UniqueName: \"kubernetes.io/projected/a5fb2328-2088-4d1f-a731-dc276b678a94-kube-api-access-r7zsm\") pod \"authentication-operator-69f744f599-htpxv\" (UID: \"a5fb2328-2088-4d1f-a731-dc276b678a94\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.444944 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds6wn"] Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.445602 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.451412 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.463698 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbe3541d-1e23-45a3-9fac-823f53e92044-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dq9pl\" (UID: \"bbe3541d-1e23-45a3-9fac-823f53e92044\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.465125 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.476273 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ccrx9" Mar 20 16:04:53 crc kubenswrapper[4675]: W0320 16:04:53.477254 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59253b32_b908_48ed_bfb6_d3374fbcd40b.slice/crio-ce4b86ce13c74df730d52e4341928612607559be78ee1a107621b33fc75804f1 WatchSource:0}: Error finding container ce4b86ce13c74df730d52e4341928612607559be78ee1a107621b33fc75804f1: Status 404 returned error can't find the container with id ce4b86ce13c74df730d52e4341928612607559be78ee1a107621b33fc75804f1 Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.484608 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.500220 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"] Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.502817 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.504280 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.525287 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.565626 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.576030 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rpvlc"] Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.580573 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbhf\" (UniqueName: \"kubernetes.io/projected/9ca3b7f0-d36c-487e-938c-da2d8781061a-kube-api-access-xdbhf\") pod \"collect-profiles-29567040-fncng\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.581016 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcvk\" (UniqueName: \"kubernetes.io/projected/d0bf4e3e-d741-4ff3-a0cc-6280d16cd533-kube-api-access-fbcvk\") pod \"migrator-59844c95c7-49qhm\" (UID: \"d0bf4e3e-d741-4ff3-a0cc-6280d16cd533\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.600704 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.612457 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/c6d2332a-bd88-45d7-8645-63778001dd65-kube-api-access-2xv82\") pod \"auto-csr-approver-29567044-wld22\" (UID: \"c6d2332a-bd88-45d7-8645-63778001dd65\") " pod="openshift-infra/auto-csr-approver-29567044-wld22" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.612889 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.620578 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.625736 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgpls\" (UniqueName: \"kubernetes.io/projected/d0dd2e63-5c15-409b-a537-4d92f9e5bd86-kube-api-access-qgpls\") pod \"packageserver-d55dfcdfc-27f97\" (UID: \"d0dd2e63-5c15-409b-a537-4d92f9e5bd86\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.626286 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.642582 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.647652 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7wr\" (UniqueName: \"kubernetes.io/projected/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-kube-api-access-xm7wr\") pod \"route-controller-manager-6576b87f9c-5g6nl\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.650132 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" event={"ID":"59253b32-b908-48ed-bfb6-d3374fbcd40b","Type":"ContainerStarted","Data":"ce4b86ce13c74df730d52e4341928612607559be78ee1a107621b33fc75804f1"} Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.654495 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" event={"ID":"a67fae4e-e87c-48bc-83e4-0bc553cf5904","Type":"ContainerStarted","Data":"14655e01594c9ac451dc2884dc00a7823d6421f46416c109ac19ab239259f001"} Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.667002 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" event={"ID":"78129d19-d0a7-404f-96a0-4096b7d7f375","Type":"ContainerStarted","Data":"b134e045094672394740ab5c2699631153f451a1b8169e864e35f4e5b59d4673"} Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.667075 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" event={"ID":"78129d19-d0a7-404f-96a0-4096b7d7f375","Type":"ContainerStarted","Data":"0cca714ed6a7ecf21a306bf71732a7ab80a976af826e9a4a82e47af84b3ff1b5"} Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.670181 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz6lt\" (UniqueName: \"kubernetes.io/projected/c311c63c-0f7e-4435-a2e3-fbc85a59594e-kube-api-access-lz6lt\") pod \"marketplace-operator-79b997595-gnrqz\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.670244 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" event={"ID":"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec","Type":"ContainerStarted","Data":"c5847fc2a0fd333c3e1b4e313943e8a44bdbfc72b0d60949b5755cd36f8f62f1"} Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.677699 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-wld22" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.683436 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksnnh\" (UniqueName: \"kubernetes.io/projected/bbaf20e3-9148-4012-aa70-0c8ba76e02c2-kube-api-access-ksnnh\") pod \"multus-admission-controller-857f4d67dd-dxwrc\" (UID: \"bbaf20e3-9148-4012-aa70-0c8ba76e02c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.686414 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.700002 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8vv\" (UniqueName: \"kubernetes.io/projected/29b966ff-c9bd-42b4-bf25-6f942fc2bb4d-kube-api-access-vd8vv\") pod \"package-server-manager-789f6589d5-dd98v\" (UID: \"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.730057 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4p4\" (UniqueName: \"kubernetes.io/projected/9ec164c8-422d-443c-aed4-b30304c06694-kube-api-access-cr4p4\") pod \"dns-default-f8nq8\" (UID: \"9ec164c8-422d-443c-aed4-b30304c06694\") " pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.744303 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d96c\" (UniqueName: \"kubernetes.io/projected/14695b26-4567-40f0-a892-25172bd0fb0a-kube-api-access-5d96c\") pod \"apiserver-7bbb656c7d-6v2jn\" (UID: \"14695b26-4567-40f0-a892-25172bd0fb0a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.758188 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j"] Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.772481 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvb9p\" (UniqueName: \"kubernetes.io/projected/1bebdfb7-34f5-4e90-b64e-c1442738c51d-kube-api-access-wvb9p\") pod \"catalog-operator-68c6474976-knk82\" (UID: \"1bebdfb7-34f5-4e90-b64e-c1442738c51d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 
16:04:53.795279 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.802649 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvt6f\" (UniqueName: \"kubernetes.io/projected/ba4d60b3-764c-4378-ba52-23f712ab9eb0-kube-api-access-zvt6f\") pod \"router-default-5444994796-q9jhd\" (UID: \"ba4d60b3-764c-4378-ba52-23f712ab9eb0\") " pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.805827 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zdf\" (UniqueName: \"kubernetes.io/projected/ffe62543-215b-47d8-9e48-de4466ce84f2-kube-api-access-k7zdf\") pod \"service-ca-9c57cc56f-gx27v\" (UID: \"ffe62543-215b-47d8-9e48-de4466ce84f2\") " pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.826342 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mghc\" (UniqueName: \"kubernetes.io/projected/45ca71c6-ab6e-4f92-ba2f-88096793d64b-kube-api-access-9mghc\") pod \"olm-operator-6b444d44fb-qhnk6\" (UID: \"45ca71c6-ab6e-4f92-ba2f-88096793d64b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.841467 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkrx\" (UniqueName: \"kubernetes.io/projected/d8efdad2-e2fa-4003-bd03-117a399b9df0-kube-api-access-rvkrx\") pod \"machine-config-controller-84d6567774-9gz5w\" (UID: \"d8efdad2-e2fa-4003-bd03-117a399b9df0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.841878 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.847997 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wzl6m"] Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.849541 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8"] Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.848706 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.862933 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjw8b\" (UniqueName: \"kubernetes.io/projected/585603c7-dfdf-4343-a32d-500c6868137e-kube-api-access-wjw8b\") pod \"machine-config-operator-74547568cd-6xn2m\" (UID: \"585603c7-dfdf-4343-a32d-500c6868137e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.864069 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.876098 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.890289 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwfd\" (UniqueName: \"kubernetes.io/projected/5e0751b5-dda7-4346-bb7f-927d886a955b-kube-api-access-sxwfd\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzqrv\" (UID: \"5e0751b5-dda7-4346-bb7f-927d886a955b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.893079 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.914973 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.932342 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.941107 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.948440 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:53 crc kubenswrapper[4675]: I0320 16:04:53.988399 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc"] Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.003706 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs"] Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004353 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-tls\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004388 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004409 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-oauth-serving-cert\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-serving-cert\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004442 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-bound-sa-token\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004458 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-config\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004473 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-config\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004491 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15212a01-a933-41ec-96ad-d0fb79722f68-audit-dir\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004526 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-etcd-client\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004543 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-serving-cert\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004570 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6152960d-fbba-4874-9127-cdd83b1d9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms29d\" (UID: \"6152960d-fbba-4874-9127-cdd83b1d9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004590 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2ksw\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-kube-api-access-s2ksw\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004611 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mz2tv\" 
(UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004627 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-serving-cert\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004653 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlbd\" (UniqueName: \"kubernetes.io/projected/15212a01-a933-41ec-96ad-d0fb79722f68-kube-api-access-tjlbd\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004672 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004713 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004732 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-image-import-ca\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.004748 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-service-ca\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.018787 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.018956 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023326 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2hqx\" (UniqueName: \"kubernetes.io/projected/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-kube-api-access-b2hqx\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023400 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/15212a01-a933-41ec-96ad-d0fb79722f68-node-pullsecrets\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023427 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-etcd-serving-ca\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023515 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-config\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023533 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-trusted-ca-bundle\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023558 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-audit\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.023614 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-encryption-config\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 
16:04:54.023747 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wjz\" (UniqueName: \"kubernetes.io/projected/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-kube-api-access-l9wjz\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.026530 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.526514187 +0000 UTC m=+214.560143724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.031426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-certificates\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.035934 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-trusted-ca\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: 
\"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.036107 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-oauth-config\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.036214 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kktbc\" (UniqueName: \"kubernetes.io/projected/6152960d-fbba-4874-9127-cdd83b1d9d7a-kube-api-access-kktbc\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms29d\" (UID: \"6152960d-fbba-4874-9127-cdd83b1d9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" Mar 20 16:04:54 crc kubenswrapper[4675]: W0320 16:04:54.034469 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceebe5f6_3cee_41d3_ab16_9d562cff84f8.slice/crio-a09868188f55648f2e14eef335db58e6a6af9d766dfd445c40504c5e081a5493 WatchSource:0}: Error finding container a09868188f55648f2e14eef335db58e6a6af9d766dfd445c40504c5e081a5493: Status 404 returned error can't find the container with id a09868188f55648f2e14eef335db58e6a6af9d766dfd445c40504c5e081a5493 Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.057671 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k6cqj"] Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.068363 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg"] Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.078881 4675 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ccrx9"] Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.079226 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.137079 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.137476 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.637425969 +0000 UTC m=+214.671055506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.138126 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2ksw\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-kube-api-access-s2ksw\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.138190 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-plugins-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139045 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139080 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-serving-cert\") pod \"console-f9d7485db-cnrtx\" (UID: 
\"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139149 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlbd\" (UniqueName: \"kubernetes.io/projected/15212a01-a933-41ec-96ad-d0fb79722f68-kube-api-access-tjlbd\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139187 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139210 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139300 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d888de-19f8-4e57-80f4-f3831177b6fd-cert\") pod \"ingress-canary-699d9\" (UID: \"a8d888de-19f8-4e57-80f4-f3831177b6fd\") " pod="openshift-ingress-canary/ingress-canary-699d9" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.139369 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ed287b4-75f1-44c6-bbb3-529a676c7e12-node-bootstrap-token\") pod 
\"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141225 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141378 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141533 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-csi-data-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141691 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-image-import-ca\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141724 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwzm\" (UniqueName: 
\"kubernetes.io/projected/a8d888de-19f8-4e57-80f4-f3831177b6fd-kube-api-access-sgwzm\") pod \"ingress-canary-699d9\" (UID: \"a8d888de-19f8-4e57-80f4-f3831177b6fd\") " pod="openshift-ingress-canary/ingress-canary-699d9" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141806 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-service-ca\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141846 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2hqx\" (UniqueName: \"kubernetes.io/projected/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-kube-api-access-b2hqx\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141889 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/15212a01-a933-41ec-96ad-d0fb79722f68-node-pullsecrets\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141916 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-etcd-serving-ca\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141951 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-config\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.141974 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-trusted-ca-bundle\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142001 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-audit\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142036 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-encryption-config\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142078 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hc5w\" (UniqueName: \"kubernetes.io/projected/5ed287b4-75f1-44c6-bbb3-529a676c7e12-kube-api-access-9hc5w\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142109 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-registration-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142146 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wjz\" (UniqueName: \"kubernetes.io/projected/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-kube-api-access-l9wjz\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142177 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ed287b4-75f1-44c6-bbb3-529a676c7e12-certs\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142211 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-certificates\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142241 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-trusted-ca\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142265 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-oauth-config\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142297 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kktbc\" (UniqueName: \"kubernetes.io/projected/6152960d-fbba-4874-9127-cdd83b1d9d7a-kube-api-access-kktbc\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms29d\" (UID: \"6152960d-fbba-4874-9127-cdd83b1d9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142434 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-tls\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142539 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142615 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-socket-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-oauth-serving-cert\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142726 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-serving-cert\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142757 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-bound-sa-token\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142801 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-config\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142826 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-config\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142897 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15212a01-a933-41ec-96ad-d0fb79722f68-audit-dir\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.142998 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-mountpoint-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.143052 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-etcd-client\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.143134 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97jm6\" (UniqueName: \"kubernetes.io/projected/74e03833-c657-4de8-935a-fd7b8580d62b-kube-api-access-97jm6\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.143184 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-serving-cert\") pod \"apiserver-76f77b778f-qdwzs\" (UID: 
\"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.143222 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6152960d-fbba-4874-9127-cdd83b1d9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms29d\" (UID: \"6152960d-fbba-4874-9127-cdd83b1d9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.144599 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-image-import-ca\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.145742 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-config\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.146867 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15212a01-a933-41ec-96ad-d0fb79722f68-audit-dir\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.147985 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-oauth-serving-cert\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.147987 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-trusted-ca-bundle\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.148050 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/15212a01-a933-41ec-96ad-d0fb79722f68-node-pullsecrets\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.148523 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-etcd-serving-ca\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.148921 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-service-ca\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.149570 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-config\") 
pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.149657 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.649639212 +0000 UTC m=+214.683268929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.149858 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.151309 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-audit\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.151841 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-certificates\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.153961 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-serving-cert\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.155972 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15212a01-a933-41ec-96ad-d0fb79722f68-config\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.160749 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-trusted-ca\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.162594 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-tls\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.163528 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6152960d-fbba-4874-9127-cdd83b1d9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms29d\" (UID: \"6152960d-fbba-4874-9127-cdd83b1d9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.164460 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-serving-cert\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.165076 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-oauth-config\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.168107 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-etcd-client\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.168875 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-encryption-config\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.169275 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15212a01-a933-41ec-96ad-d0fb79722f68-serving-cert\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.181203 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vl96h"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.210025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2ksw\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-kube-api-access-s2ksw\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.210469 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.234918 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlbd\" (UniqueName: \"kubernetes.io/projected/15212a01-a933-41ec-96ad-d0fb79722f68-kube-api-access-tjlbd\") pod \"apiserver-76f77b778f-qdwzs\" (UID: \"15212a01-a933-41ec-96ad-d0fb79722f68\") " pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.235923 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kktbc\" (UniqueName: \"kubernetes.io/projected/6152960d-fbba-4874-9127-cdd83b1d9d7a-kube-api-access-kktbc\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms29d\" (UID: \"6152960d-fbba-4874-9127-cdd83b1d9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.244923 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.245119 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.745102459 +0000 UTC m=+214.778731996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245471 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hc5w\" (UniqueName: \"kubernetes.io/projected/5ed287b4-75f1-44c6-bbb3-529a676c7e12-kube-api-access-9hc5w\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245510 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-registration-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245545 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ed287b4-75f1-44c6-bbb3-529a676c7e12-certs\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245604 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245627 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-socket-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245714 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-mountpoint-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245820 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97jm6\" (UniqueName: \"kubernetes.io/projected/74e03833-c657-4de8-935a-fd7b8580d62b-kube-api-access-97jm6\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245898 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-plugins-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245999 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d888de-19f8-4e57-80f4-f3831177b6fd-cert\") pod \"ingress-canary-699d9\" (UID: \"a8d888de-19f8-4e57-80f4-f3831177b6fd\") " pod="openshift-ingress-canary/ingress-canary-699d9"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.246071 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ed287b4-75f1-44c6-bbb3-529a676c7e12-node-bootstrap-token\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.246140 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-csi-data-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.246216 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwzm\" (UniqueName: \"kubernetes.io/projected/a8d888de-19f8-4e57-80f4-f3831177b6fd-kube-api-access-sgwzm\") pod \"ingress-canary-699d9\" (UID: \"a8d888de-19f8-4e57-80f4-f3831177b6fd\") " pod="openshift-ingress-canary/ingress-canary-699d9"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.245727 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-mountpoint-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.246575 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-socket-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.246662 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-plugins-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.247454 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.747419395 +0000 UTC m=+214.781048932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.248146 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-registration-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.250145 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/74e03833-c657-4de8-935a-fd7b8580d62b-csi-data-dir\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.255551 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-bound-sa-token\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.256304 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ed287b4-75f1-44c6-bbb3-529a676c7e12-certs\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.260642 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ed287b4-75f1-44c6-bbb3-529a676c7e12-node-bootstrap-token\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.266302 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d888de-19f8-4e57-80f4-f3831177b6fd-cert\") pod \"ingress-canary-699d9\" (UID: \"a8d888de-19f8-4e57-80f4-f3831177b6fd\") " pod="openshift-ingress-canary/ingress-canary-699d9"
Mar 20 16:04:54 crc kubenswrapper[4675]: W0320 16:04:54.276310 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba4d60b3_764c_4378_ba52_23f712ab9eb0.slice/crio-dcac04f028e23c5d69563f6605416d245286b9274db1515923872b3ab98a9b0a WatchSource:0}: Error finding container dcac04f028e23c5d69563f6605416d245286b9274db1515923872b3ab98a9b0a: Status 404 returned error can't find the container with id dcac04f028e23c5d69563f6605416d245286b9274db1515923872b3ab98a9b0a
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.287373 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wjz\" (UniqueName: \"kubernetes.io/projected/a30f4c73-2cdf-4fa4-b870-1bce73d3ceed-kube-api-access-l9wjz\") pod \"service-ca-operator-777779d784-vh65f\" (UID: \"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.288346 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2hqx\" (UniqueName: \"kubernetes.io/projected/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-kube-api-access-b2hqx\") pod \"console-f9d7485db-cnrtx\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.347411 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.347906 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.847889327 +0000 UTC m=+214.881518864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.353090 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hc5w\" (UniqueName: \"kubernetes.io/projected/5ed287b4-75f1-44c6-bbb3-529a676c7e12-kube-api-access-9hc5w\") pod \"machine-config-server-b5d8k\" (UID: \"5ed287b4-75f1-44c6-bbb3-529a676c7e12\") " pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.356168 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97jm6\" (UniqueName: \"kubernetes.io/projected/74e03833-c657-4de8-935a-fd7b8580d62b-kube-api-access-97jm6\") pod \"csi-hostpathplugin-qn4j5\" (UID: \"74e03833-c657-4de8-935a-fd7b8580d62b\") " pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.361560 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-htpxv"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.365412 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.369641 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.377591 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwzm\" (UniqueName: \"kubernetes.io/projected/a8d888de-19f8-4e57-80f4-f3831177b6fd-kube-api-access-sgwzm\") pod \"ingress-canary-699d9\" (UID: \"a8d888de-19f8-4e57-80f4-f3831177b6fd\") " pod="openshift-ingress-canary/ingress-canary-699d9"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.413441 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.418997 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.449669 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.450038 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:54.950025106 +0000 UTC m=+214.983654643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.517902 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.551099 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.551383 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.051356812 +0000 UTC m=+215.084986339 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.557174 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.615619 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-b5d8k"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.621902 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.637135 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-699d9"
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.653754 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.654114 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.154098478 +0000 UTC m=+215.187728095 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.754561 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.755176 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.255043813 +0000 UTC m=+215.288673350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.806060 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" event={"ID":"9040e8e8-dcda-4de9-b015-0cc0e947858d","Type":"ContainerStarted","Data":"c46dbff4f1d0e3bbe7658f53c1b7c6a300442924d1dd2f8034c025c253e8249a"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.807350 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" event={"ID":"08f5035b-6ecf-49dc-8317-d40e5675a472","Type":"ContainerStarted","Data":"bcc2e86d76b94c566a617f1ff9f39ce0b62e70dcb04cd74c009028c1b3ac7665"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.836115 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" event={"ID":"ceebe5f6-3cee-41d3-ab16-9d562cff84f8","Type":"ContainerStarted","Data":"a09868188f55648f2e14eef335db58e6a6af9d766dfd445c40504c5e081a5493"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.839117 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-wld22"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.843548 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9tmzc"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.860350 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.860926 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.36090922 +0000 UTC m=+215.394538757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.867384 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng"]
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.878264 4675 generic.go:334] "Generic (PLEG): container finished" podID="a67fae4e-e87c-48bc-83e4-0bc553cf5904" containerID="0aaede9e535aa08e2e35d758ed1895ea0c3cbd8a820d052ef3153356e2409d0f" exitCode=0
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.878355 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" event={"ID":"a67fae4e-e87c-48bc-83e4-0bc553cf5904","Type":"ContainerDied","Data":"0aaede9e535aa08e2e35d758ed1895ea0c3cbd8a820d052ef3153356e2409d0f"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.879960 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.897143 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" event={"ID":"fec3b15e-e8c4-4f4b-9153-4014bbf77c86","Type":"ContainerStarted","Data":"334261ad375513401a665cc4ffd7f267a0dbb3741b69f1ae922cfd68630b118f"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.920207 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q9jhd" event={"ID":"ba4d60b3-764c-4378-ba52-23f712ab9eb0","Type":"ContainerStarted","Data":"dcac04f028e23c5d69563f6605416d245286b9274db1515923872b3ab98a9b0a"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.926137 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" event={"ID":"d80adb62-f7cc-4c49-98da-7a1167881907","Type":"ContainerStarted","Data":"2d77abdcd65e0edbbd5eb479d4e553a5a9857bfec5fbc5ff7d7dfb5f78e41f90"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.926209 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" event={"ID":"d80adb62-f7cc-4c49-98da-7a1167881907","Type":"ContainerStarted","Data":"73a4f62a804c7073098ddb576a648e89a239821ea2dfdfcb83b8b97b8ed2196b"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.942224 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" event={"ID":"a5fb2328-2088-4d1f-a731-dc276b678a94","Type":"ContainerStarted","Data":"c1a88e5e0b7e160136dbe0a06cb34e8c49727dc821bcb51af3212625972fc2fb"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.960296 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" event={"ID":"78129d19-d0a7-404f-96a0-4096b7d7f375","Type":"ContainerStarted","Data":"0ab35732deedf96a11f693c28732c30232507605a4c119f4d5612ed635b7e65b"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.961504 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.961663 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.461632907 +0000 UTC m=+215.495262454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.961872 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:54 crc kubenswrapper[4675]: E0320 16:04:54.963313 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.463301696 +0000 UTC m=+215.496931233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.976904 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" event={"ID":"1b98a434-f0d3-415c-adbb-0ff614dac3e4","Type":"ContainerStarted","Data":"26c107ae2bb169db6874ce59a5b1c23d5f5bcffb7a75a90bdec5876a9c7ff64a"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.984908 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" event={"ID":"d1e1fc22-8efc-4ca9-a3c4-4736385a1138","Type":"ContainerStarted","Data":"036a2df546fc88a80f2978501f91654cd886398aec0545c340269ea1d4832d02"}
Mar 20 16:04:54 crc kubenswrapper[4675]: I0320 16:04:54.992692 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" event={"ID":"bf42be81-82f9-47c4-a968-1c048e52d4f3","Type":"ContainerStarted","Data":"e45679f4c0fc75af7d7f11d5a0219e7d89066b352a06a6fc43f770661664488a"}
Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.000334 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ccrx9"
event={"ID":"3074e872-f732-42c6-b7c3-6a88e0f5b81c","Type":"ContainerStarted","Data":"ac28906c0ce3b3a2565e9892f1c6c9d2ab38b7c0cf20b72e765f1f9d9e45d386"} Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.001664 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ccrx9" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.004251 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" event={"ID":"6d96a004-baaa-4e15-af6b-e25b8e503958","Type":"ContainerStarted","Data":"368a3b9d03252877173c950306f2f70c23e16356a5cae37c6afb164e25dff21a"} Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.004811 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.008912 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" event={"ID":"d0bf4e3e-d741-4ff3-a0cc-6280d16cd533","Type":"ContainerStarted","Data":"89a6517bd3334673341bb15ffd3ca6ba97ec0c457ddd0c0f9cd7bd265e5a9b2f"} Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.018276 4675 patch_prober.go:28] interesting pod/console-operator-58897d9998-k6cqj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.018340 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" podUID="6d96a004-baaa-4e15-af6b-e25b8e503958" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 20 16:04:55 crc 
kubenswrapper[4675]: I0320 16:04:55.020023 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.022453 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-ccrx9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.022497 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ccrx9" podUID="3074e872-f732-42c6-b7c3-6a88e0f5b81c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.023453 4675 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rpvlc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.023474 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" podUID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.024545 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" event={"ID":"59253b32-b908-48ed-bfb6-d3374fbcd40b","Type":"ContainerStarted","Data":"7de2f6e29598a5787e4df34a4fa04bc6565a724c7db13daf7d6afc5667745924"} Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 
16:04:55.025476 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.027043 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" event={"ID":"bbe3541d-1e23-45a3-9fac-823f53e92044","Type":"ContainerStarted","Data":"0b18e26e03cacefa19b77ece0e38f9d849ea2bf3ee8452772e10558b35334657"} Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.031543 4675 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ds6wn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.031592 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" podUID="59253b32-b908-48ed-bfb6-d3374fbcd40b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.070922 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.073204 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:55.573179668 +0000 UTC m=+215.606809245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.076716 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.094002 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.593983009 +0000 UTC m=+215.627612546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.190127 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.190312 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.69027921 +0000 UTC m=+215.723908747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.190430 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.190667 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.690659421 +0000 UTC m=+215.724288958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.228982 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.259038 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.259221 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dxwrc"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.291138 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.291595 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:55.791563024 +0000 UTC m=+215.825192561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.293957 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.299211 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.308502 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f8nq8"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.309972 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gnrqz"] Mar 20 16:04:55 crc kubenswrapper[4675]: W0320 16:04:55.337528 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dd2e63_5c15_409b_a537_4d92f9e5bd86.slice/crio-fa8b46260dd3af77cdd4347fdcf181c37ec42b41e979ecc9d5f2d2f0d682d493 WatchSource:0}: Error finding container fa8b46260dd3af77cdd4347fdcf181c37ec42b41e979ecc9d5f2d2f0d682d493: Status 404 returned error can't find the container with id fa8b46260dd3af77cdd4347fdcf181c37ec42b41e979ecc9d5f2d2f0d682d493 Mar 20 16:04:55 crc kubenswrapper[4675]: W0320 16:04:55.337903 4675 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec164c8_422d_443c_aed4_b30304c06694.slice/crio-4627a59162e4607d1dc7268080ca89ee8d76342752b71ea83e7c36e41c36248c WatchSource:0}: Error finding container 4627a59162e4607d1dc7268080ca89ee8d76342752b71ea83e7c36e41c36248c: Status 404 returned error can't find the container with id 4627a59162e4607d1dc7268080ca89ee8d76342752b71ea83e7c36e41c36248c Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.346151 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.363908 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.383638 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.385044 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.392544 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.392913 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:55.89289731 +0000 UTC m=+215.926526847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.483228 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.483327 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d"] Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.501456 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.001422194 +0000 UTC m=+216.035051731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.501157 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.503531 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.504786 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.00475459 +0000 UTC m=+216.038384137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.526164 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.528053 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gx27v"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.533050 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qn4j5"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.533098 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qdwzs"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.537508 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cnrtx"] Mar 20 16:04:55 crc kubenswrapper[4675]: W0320 16:04:55.541073 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6152960d_fbba_4874_9127_cdd83b1d9d7a.slice/crio-ed36b937f9febb22bb6432b30d60afa845e300338156552537aee2b8f4b8df24 WatchSource:0}: Error finding container ed36b937f9febb22bb6432b30d60afa845e300338156552537aee2b8f4b8df24: Status 404 returned error can't find the container with id ed36b937f9febb22bb6432b30d60afa845e300338156552537aee2b8f4b8df24 Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.547457 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vh65f"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.606784 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.608880 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.108853286 +0000 UTC m=+216.142482883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.642736 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-699d9"] Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.666891 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" podStartSLOduration=154.666870571 podStartE2EDuration="2m34.666870571s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:55.665400459 +0000 UTC m=+215.699029996" 
watchObservedRunningTime="2026-03-20 16:04:55.666870571 +0000 UTC m=+215.700500108" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.708912 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.709316 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.209299826 +0000 UTC m=+216.242929363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.713028 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" podStartSLOduration=154.712998283 podStartE2EDuration="2m34.712998283s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:55.707805953 +0000 UTC m=+215.741435500" watchObservedRunningTime="2026-03-20 16:04:55.712998283 +0000 UTC m=+215.746627820" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 
16:04:55.745441 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hhqgm" podStartSLOduration=154.745425009 podStartE2EDuration="2m34.745425009s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:55.744896104 +0000 UTC m=+215.778525651" watchObservedRunningTime="2026-03-20 16:04:55.745425009 +0000 UTC m=+215.779054546" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.810462 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.810686 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.310644283 +0000 UTC m=+216.344273820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.811483 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.812008 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.311997812 +0000 UTC m=+216.345627349 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.815609 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" podStartSLOduration=154.815588005 podStartE2EDuration="2m34.815588005s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:55.784276551 +0000 UTC m=+215.817906088" watchObservedRunningTime="2026-03-20 16:04:55.815588005 +0000 UTC m=+215.849217542" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.912487 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:55 crc kubenswrapper[4675]: E0320 16:04:55.912995 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.412974367 +0000 UTC m=+216.446603894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.914521 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" podStartSLOduration=154.914499631 podStartE2EDuration="2m34.914499631s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:55.910312971 +0000 UTC m=+215.943942508" watchObservedRunningTime="2026-03-20 16:04:55.914499631 +0000 UTC m=+215.948129168" Mar 20 16:04:55 crc kubenswrapper[4675]: I0320 16:04:55.946999 4675 ???:1] "http: TLS handshake error from 192.168.126.11:58978: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.016163 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.016635 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:56.516596139 +0000 UTC m=+216.550225676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.038987 4675 ???:1] "http: TLS handshake error from 192.168.126.11:58984: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.040266 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ccrx9" podStartSLOduration=155.040245482 podStartE2EDuration="2m35.040245482s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:55.991093343 +0000 UTC m=+216.024722880" watchObservedRunningTime="2026-03-20 16:04:56.040245482 +0000 UTC m=+216.073875019" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.045678 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" event={"ID":"bbe3541d-1e23-45a3-9fac-823f53e92044","Type":"ContainerStarted","Data":"df141aa12205bd81e4485f8f6e30cfb5d44d354c9d9307905fd05bef68721fc6"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.049202 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" 
event={"ID":"6f6b57cd-8b1f-4d89-9ff3-a00efc202135","Type":"ContainerStarted","Data":"2f11a176e50e61da7123cbb4c15aaab55e99d3aa958ac6b61e71c7e1b34aaa94"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.067612 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmf9j" podStartSLOduration=155.067591692 podStartE2EDuration="2m35.067591692s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.065714778 +0000 UTC m=+216.099344315" watchObservedRunningTime="2026-03-20 16:04:56.067591692 +0000 UTC m=+216.101221229" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.067932 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" event={"ID":"d0bf4e3e-d741-4ff3-a0cc-6280d16cd533","Type":"ContainerStarted","Data":"0526615a3feef04bd291774aa98f2924921af4e59181f0f027099947f65ebaf1"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.067981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" event={"ID":"d0bf4e3e-d741-4ff3-a0cc-6280d16cd533","Type":"ContainerStarted","Data":"303236e5e70614f4242914c788d3e11485ef4b414c223d74249439d8d2ed5f11"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.078302 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" event={"ID":"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed","Type":"ContainerStarted","Data":"0e1673a6f459d029fe2639e63ca0cb57063f08d412095ff9891c97b39fb4dfba"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.101483 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dq9pl" podStartSLOduration=155.10146271 podStartE2EDuration="2m35.10146271s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.099173044 +0000 UTC m=+216.132802591" watchObservedRunningTime="2026-03-20 16:04:56.10146271 +0000 UTC m=+216.135092247" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.101664 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" event={"ID":"9040e8e8-dcda-4de9-b015-0cc0e947858d","Type":"ContainerStarted","Data":"0b98733dd99a554d811a9c84a50ffd5e1139c8a04bab12ce4b73b1d9575398d8"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.121244 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.121478 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.621431707 +0000 UTC m=+216.655061244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.121734 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.124628 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.624584498 +0000 UTC m=+216.658214215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.144062 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" event={"ID":"ffe62543-215b-47d8-9e48-de4466ce84f2","Type":"ContainerStarted","Data":"c1c0f6fd6c6eba3b7a0c81fdc63110c87c2586998997bdb71b776aab0069df63"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.167242 4675 ???:1] "http: TLS handshake error from 192.168.126.11:58990: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.169492 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-49qhm" podStartSLOduration=155.169470224 podStartE2EDuration="2m35.169470224s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.144370269 +0000 UTC m=+216.177999806" watchObservedRunningTime="2026-03-20 16:04:56.169470224 +0000 UTC m=+216.203099771" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.204300 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gd5j8" event={"ID":"1b98a434-f0d3-415c-adbb-0ff614dac3e4","Type":"ContainerStarted","Data":"a78dfc28e34bf6b82a6ed636500b699ea460430e73a6995af9bc5a09fd7974e2"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.216313 4675 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-vl96h" podStartSLOduration=155.216263625 podStartE2EDuration="2m35.216263625s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.204847995 +0000 UTC m=+216.238477552" watchObservedRunningTime="2026-03-20 16:04:56.216263625 +0000 UTC m=+216.249893162" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.227464 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.228484 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.728429156 +0000 UTC m=+216.762058693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.253615 4675 ???:1] "http: TLS handshake error from 192.168.126.11:58992: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.266508 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" event={"ID":"08f5035b-6ecf-49dc-8317-d40e5675a472","Type":"ContainerStarted","Data":"82bdf6767f9049c18f62e01433c2144cd9a86bd4c2d81f8b73b31363959e67ef"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.266564 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" event={"ID":"08f5035b-6ecf-49dc-8317-d40e5675a472","Type":"ContainerStarted","Data":"88d2a8850169bd08ff640dba8ddd685ab149df35109ac1f7c61ddfad647c17d0"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.280359 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" event={"ID":"ceebe5f6-3cee-41d3-ab16-9d562cff84f8","Type":"ContainerStarted","Data":"361f4b5e082032c5f06b7cbb647918f49478250194203bfaefc15d8f3750d246"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.280410 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" event={"ID":"ceebe5f6-3cee-41d3-ab16-9d562cff84f8","Type":"ContainerStarted","Data":"b2cd6f25d517baabf5c85fcdd1bdbe2e8aed1dd63063f86fc6d112524680c1c0"} Mar 20 16:04:56 crc 
kubenswrapper[4675]: I0320 16:04:56.283061 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" event={"ID":"bbaf20e3-9148-4012-aa70-0c8ba76e02c2","Type":"ContainerStarted","Data":"e1da8fdbebd4f93634b79f7e2eee79431fb0f56ba2eddc902a5abe815235b8c2"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.284117 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" event={"ID":"74e03833-c657-4de8-935a-fd7b8580d62b","Type":"ContainerStarted","Data":"7765eed0ac9cdd9bf0d93267624da3cde2b2aaa357fab71cc4a0180af28eeab6"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.295255 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ccrx9" event={"ID":"3074e872-f732-42c6-b7c3-6a88e0f5b81c","Type":"ContainerStarted","Data":"2e58dd307f2333a9b8e72a3cc947e16145f315323f06b99d3d96177075627bdc"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.296274 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-ccrx9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.296375 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ccrx9" podUID="3074e872-f732-42c6-b7c3-6a88e0f5b81c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.306923 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sdxzc" podStartSLOduration=155.306906452 podStartE2EDuration="2m35.306906452s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.297635575 +0000 UTC m=+216.331265122" watchObservedRunningTime="2026-03-20 16:04:56.306906452 +0000 UTC m=+216.340535989" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.318501 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" event={"ID":"d1e1fc22-8efc-4ca9-a3c4-4736385a1138","Type":"ContainerStarted","Data":"efd57cf78b1492d0c97a4ae5ca417ffaffc1aafbdfd86ad12afaccfa078ae1e1"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.331613 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.332752 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.832738538 +0000 UTC m=+216.866368075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.348400 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" event={"ID":"57069867-4c85-4262-ac54-05d5257ad81b","Type":"ContainerStarted","Data":"5bac067bb46cc239e40b6aef247327446fa4e111e8a05f258ac5cf5bcb3e7d32"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.349336 4675 ???:1] "http: TLS handshake error from 192.168.126.11:59006: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.359539 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wzl6m" podStartSLOduration=155.359521182 podStartE2EDuration="2m35.359521182s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.330559775 +0000 UTC m=+216.364189312" watchObservedRunningTime="2026-03-20 16:04:56.359521182 +0000 UTC m=+216.393150709" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.360513 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-649kg" podStartSLOduration=155.36050707 podStartE2EDuration="2m35.36050707s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.359984105 +0000 UTC m=+216.393613642" watchObservedRunningTime="2026-03-20 16:04:56.36050707 +0000 UTC m=+216.394136607" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.374911 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" event={"ID":"9ca3b7f0-d36c-487e-938c-da2d8781061a","Type":"ContainerStarted","Data":"c8d81c49af6e574ff8e1c069754226c6d21a33d0af447a96f5b2c3c3b23d8944"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.374966 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" event={"ID":"9ca3b7f0-d36c-487e-938c-da2d8781061a","Type":"ContainerStarted","Data":"aa16fd362bd2b22fc7d1d68114a67fdd9a37181e4c2f3d9f0ec42275ba58ac75"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.403324 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" podStartSLOduration=155.403310536 podStartE2EDuration="2m35.403310536s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.402382119 +0000 UTC m=+216.436011656" watchObservedRunningTime="2026-03-20 16:04:56.403310536 +0000 UTC m=+216.436940073" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.413581 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cnrtx" event={"ID":"ccabe656-71a5-4e5b-b5f8-093e1b38f62c","Type":"ContainerStarted","Data":"a1f6de4fc96a8199548db60237b7f23998d843007478d92bdbd8d2fe84c94c4e"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.442204 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.443299 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:56.94328239 +0000 UTC m=+216.976911937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.446478 4675 ???:1] "http: TLS handshake error from 192.168.126.11:59020: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.475496 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" podStartSLOduration=155.4754751 podStartE2EDuration="2m35.4754751s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.441422506 +0000 UTC m=+216.475052043" watchObservedRunningTime="2026-03-20 16:04:56.4754751 +0000 UTC m=+216.509104657" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.490469 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-b5d8k" event={"ID":"5ed287b4-75f1-44c6-bbb3-529a676c7e12","Type":"ContainerStarted","Data":"09e5b2ff52e715ca478f483c3eca418d24c601d4126e7b843c7d0d5ff47c1444"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.490519 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-b5d8k" event={"ID":"5ed287b4-75f1-44c6-bbb3-529a676c7e12","Type":"ContainerStarted","Data":"2bc8a5268ddea9cf682a5dc2c9add99e5caed436d774bbc049142e005e39bd55"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.525450 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" event={"ID":"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec","Type":"ContainerStarted","Data":"5d22243f7c7b9a95adb578c6134b45e34351d4b8ebe7efb6b284d74cf20faaa0"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.545308 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.546529 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.046515261 +0000 UTC m=+217.080144788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.553053 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" event={"ID":"a5fb2328-2088-4d1f-a731-dc276b678a94","Type":"ContainerStarted","Data":"bc628d40db92faa482f492522fd34d85719c02fe819da8e8e2bd75d75d6fe3c6"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.595473 4675 ???:1] "http: TLS handshake error from 192.168.126.11:59030: no serving certificate available for the kubelet" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.595648 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-cnrtx" podStartSLOduration=155.595627429 podStartE2EDuration="2m35.595627429s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.478876548 +0000 UTC m=+216.512506095" watchObservedRunningTime="2026-03-20 16:04:56.595627429 +0000 UTC m=+216.629256966" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.595960 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-b5d8k" podStartSLOduration=5.595955068 podStartE2EDuration="5.595955068s" podCreationTimestamp="2026-03-20 16:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 16:04:56.595409603 +0000 UTC m=+216.629039140" watchObservedRunningTime="2026-03-20 16:04:56.595955068 +0000 UTC m=+216.629584605" Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.603126 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" event={"ID":"585603c7-dfdf-4343-a32d-500c6868137e","Type":"ContainerStarted","Data":"4e13118f2d43ab764b19409ac5aa01acad55d813e8040730496579fbd884bd59"} Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.647452 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.648933 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.148918528 +0000 UTC m=+217.182548055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.738681 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" event={"ID":"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d","Type":"ContainerStarted","Data":"5f77c79610a3220f424d4382a1b10b679026a98af711ba94ce100a4a6e8cb98c"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.738727 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" event={"ID":"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d","Type":"ContainerStarted","Data":"f93314aef433c8cd8e0db96d5b698e9cbec90727a109367652749c0faf423a8b"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.738739 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" event={"ID":"14695b26-4567-40f0-a892-25172bd0fb0a","Type":"ContainerStarted","Data":"b76e796d9c5eedc51fe88ba35590950c2b2547c9d880c78da84ff006d816ffb4"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.754063 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" event={"ID":"bf42be81-82f9-47c4-a968-1c048e52d4f3","Type":"ContainerStarted","Data":"dbfd86060dc1eecfe7a97e479a9faf862cb6f8b1d2bcd057fa103126aba5e966"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.755591 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.755917 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.255907247 +0000 UTC m=+217.289536784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.769361 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" event={"ID":"d0dd2e63-5c15-409b-a537-4d92f9e5bd86","Type":"ContainerStarted","Data":"fa8b46260dd3af77cdd4347fdcf181c37ec42b41e979ecc9d5f2d2f0d682d493"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.770016 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.773612 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" event={"ID":"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0","Type":"ContainerStarted","Data":"b8d175f1dae83d8c9a886f37def64d22c71b0baa39609e73f1b663966733019e"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.773651 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" event={"ID":"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0","Type":"ContainerStarted","Data":"cf14a3426686059a21a564506388b6d46e9a4fffadf11db12f79d669ed84a97e"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.778020 4675 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-27f97 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.778067 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" podUID="d0dd2e63-5c15-409b-a537-4d92f9e5bd86" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.783184 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" event={"ID":"5e0751b5-dda7-4346-bb7f-927d886a955b","Type":"ContainerStarted","Data":"f6693c6a56e7e4bb80dde1d5acb3722f662d33dae0fed5362097f884ae5027e5"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.783238 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" event={"ID":"5e0751b5-dda7-4346-bb7f-927d886a955b","Type":"ContainerStarted","Data":"1450c26199307f9dbe265ac088940d2b2ffc9a91b37417d8932cc26bd7018caf"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.846660 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" event={"ID":"a67fae4e-e87c-48bc-83e4-0bc553cf5904","Type":"ContainerStarted","Data":"82e94a46d3637af71c1b226a6662058815d6e269f362c0b8abdacbddf4183ed8"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.847530 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.851492 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-htpxv" podStartSLOduration=155.851474287 podStartE2EDuration="2m35.851474287s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.669059549 +0000 UTC m=+216.702689096" watchObservedRunningTime="2026-03-20 16:04:56.851474287 +0000 UTC m=+216.885103824"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.856372 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.859322 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.359306923 +0000 UTC m=+217.392936460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.882886 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jmncz" podStartSLOduration=155.882869663 podStartE2EDuration="2m35.882869663s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.852421564 +0000 UTC m=+216.886051101" watchObservedRunningTime="2026-03-20 16:04:56.882869663 +0000 UTC m=+216.916499200"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.905954 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" podStartSLOduration=155.905938689 podStartE2EDuration="2m35.905938689s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.880205026 +0000 UTC m=+216.913834563" watchObservedRunningTime="2026-03-20 16:04:56.905938689 +0000 UTC m=+216.939568216"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.907159 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzqrv" podStartSLOduration=155.907153584 podStartE2EDuration="2m35.907153584s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.905051534 +0000 UTC m=+216.938681081" watchObservedRunningTime="2026-03-20 16:04:56.907153584 +0000 UTC m=+216.940783121"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.934245 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.948623 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" event={"ID":"c311c63c-0f7e-4435-a2e3-fbc85a59594e","Type":"ContainerStarted","Data":"b01840b87a401000182470eba216332e1a267fdc769e78e6bb5f9e53cda67173"}
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.948621 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt" podStartSLOduration=155.948604141 podStartE2EDuration="2m35.948604141s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:56.948052755 +0000 UTC m=+216.981682302" watchObservedRunningTime="2026-03-20 16:04:56.948604141 +0000 UTC m=+216.982233678"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.949534 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz"
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.959752 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:56 crc kubenswrapper[4675]: E0320 16:04:56.960578 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.460567737 +0000 UTC m=+217.494197274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.968671 4675 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gnrqz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Mar 20 16:04:56 crc kubenswrapper[4675]: I0320 16:04:56.968718 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.005604 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" event={"ID":"fec3b15e-e8c4-4f4b-9153-4014bbf77c86","Type":"ContainerStarted","Data":"7972a5bc8e7081ca41827d222c7801b0d338d1646a76ee3cb32abf881f1045df"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.005659 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" event={"ID":"fec3b15e-e8c4-4f4b-9153-4014bbf77c86","Type":"ContainerStarted","Data":"9787698b34fabe63efb3ec988e54e919741e1cd687ab39c47017db67170c35a2"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.005994 4675 ???:1] "http: TLS handshake error from 192.168.126.11:59046: no serving certificate available for the kubelet"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.044787 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" event={"ID":"1bebdfb7-34f5-4e90-b64e-c1442738c51d","Type":"ContainerStarted","Data":"5d795c1c2959e80c6391637b470ba71bea01bda88e4848ca5386ff381611babd"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.092323 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.095452 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-699d9" event={"ID":"a8d888de-19f8-4e57-80f4-f3831177b6fd","Type":"ContainerStarted","Data":"744efa6dcb9e19678977542befb50506099d3e07a00ec9b6f301e3a4273627c6"}
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.096749 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.596728238 +0000 UTC m=+217.630357775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.110828 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.112680 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.612668008 +0000 UTC m=+217.646297545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.121678 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f8nq8" event={"ID":"9ec164c8-422d-443c-aed4-b30304c06694","Type":"ContainerStarted","Data":"4627a59162e4607d1dc7268080ca89ee8d76342752b71ea83e7c36e41c36248c"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.121512 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gcwfs" podStartSLOduration=156.12136243 podStartE2EDuration="2m36.12136243s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:57.120588087 +0000 UTC m=+217.154217624" watchObservedRunningTime="2026-03-20 16:04:57.12136243 +0000 UTC m=+217.154991967"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.139128 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" podStartSLOduration=156.139090581 podStartE2EDuration="2m36.139090581s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:57.092229858 +0000 UTC m=+217.125859405" watchObservedRunningTime="2026-03-20 16:04:57.139090581 +0000 UTC m=+217.172720118"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.139350 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" event={"ID":"45ca71c6-ab6e-4f92-ba2f-88096793d64b","Type":"ContainerStarted","Data":"dd7e7d73fae2524ad843b443586d4954eab86748dbc98af4089e1b8915283eca"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.158449 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.166199 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" event={"ID":"6152960d-fbba-4874-9127-cdd83b1d9d7a","Type":"ContainerStarted","Data":"ed36b937f9febb22bb6432b30d60afa845e300338156552537aee2b8f4b8df24"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.186245 4675 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhnk6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.186576 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" podUID="45ca71c6-ab6e-4f92-ba2f-88096793d64b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.187370 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" podStartSLOduration=156.187354865 podStartE2EDuration="2m36.187354865s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:57.18544802 +0000 UTC m=+217.219077567" watchObservedRunningTime="2026-03-20 16:04:57.187354865 +0000 UTC m=+217.220984402"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.210038 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" podStartSLOduration=156.21002468 podStartE2EDuration="2m36.21002468s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:57.2093611 +0000 UTC m=+217.242990637" watchObservedRunningTime="2026-03-20 16:04:57.21002468 +0000 UTC m=+217.243654217"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.212208 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.213174 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.71316098 +0000 UTC m=+217.746790517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.219653 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" event={"ID":"6d96a004-baaa-4e15-af6b-e25b8e503958","Type":"ContainerStarted","Data":"1a213f628d75033dc8ffa429c454a495be48d8b34b6a1f7bf9d4ff11e9fc1976"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.220686 4675 patch_prober.go:28] interesting pod/console-operator-58897d9998-k6cqj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.220721 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" podUID="6d96a004-baaa-4e15-af6b-e25b8e503958" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.255094 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" event={"ID":"d8efdad2-e2fa-4003-bd03-117a399b9df0","Type":"ContainerStarted","Data":"77b16598f5cae4358d4c904110b709dcca9e9c9fadbaf8172973c11aeb50d1c4"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.273077 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-wld22" event={"ID":"c6d2332a-bd88-45d7-8645-63778001dd65","Type":"ContainerStarted","Data":"8f7c0f92776fc5e06ed3dec3038d8b84e8d9f4731a9777197d1cfb1195fbdaa3"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.286967 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" event={"ID":"15212a01-a933-41ec-96ad-d0fb79722f68","Type":"ContainerStarted","Data":"4ce2c1254b57273a0cca345cefd8ff9154d78f5c2a1ae7060b16a8746016e6ee"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.313703 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.316269 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.816254347 +0000 UTC m=+217.849883884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.317583 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q9jhd" event={"ID":"ba4d60b3-764c-4378-ba52-23f712ab9eb0","Type":"ContainerStarted","Data":"9677bb8ed3a3f8853dae299cc1a0b5698ca51260f9061b99e574c4939e04a068"}
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.350626 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.365063 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-q9jhd" podStartSLOduration=156.365045886 podStartE2EDuration="2m36.365045886s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:57.364536291 +0000 UTC m=+217.398165828" watchObservedRunningTime="2026-03-20 16:04:57.365045886 +0000 UTC m=+217.398675423"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.365261 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" podStartSLOduration=156.365257572 podStartE2EDuration="2m36.365257572s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:57.307910596 +0000 UTC m=+217.341540133" watchObservedRunningTime="2026-03-20 16:04:57.365257572 +0000 UTC m=+217.398887109"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.417665 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.417857 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.91782329 +0000 UTC m=+217.951452827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.418175 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.423707 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:57.923694659 +0000 UTC m=+217.957324196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.520971 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.521364 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.021337899 +0000 UTC m=+218.054967436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.623363 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.623688 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.123675284 +0000 UTC m=+218.157304821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.724371 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.724607 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.224582048 +0000 UTC m=+218.258211585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.724850 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.725175 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.225160144 +0000 UTC m=+218.258789671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.734941 4675 ???:1] "http: TLS handshake error from 192.168.126.11:59054: no serving certificate available for the kubelet"
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.825886 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.826215 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.326196862 +0000 UTC m=+218.359826399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:57 crc kubenswrapper[4675]: I0320 16:04:57.927605 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:57 crc kubenswrapper[4675]: E0320 16:04:57.927950 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.427938679 +0000 UTC m=+218.461568206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.028977 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.029355 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.529334687 +0000 UTC m=+218.562964224 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.080100 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.086687 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:58 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:04:58 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:04:58 crc kubenswrapper[4675]: healthz check failed Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.086739 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.130363 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.130645 4675 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.630632262 +0000 UTC m=+218.664261799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.231224 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.231508 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.731475144 +0000 UTC m=+218.765104691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.231901 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.232229 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.732215325 +0000 UTC m=+218.765844852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.332136 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97" event={"ID":"d0dd2e63-5c15-409b-a537-4d92f9e5bd86","Type":"ContainerStarted","Data":"e2c20295d67b1a0d7456db273fc8f0404d175f7d225228aa0cc0ab85427b2b6f"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.332384 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.333011 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.832970745 +0000 UTC m=+218.866600332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.335187 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f8nq8" event={"ID":"9ec164c8-422d-443c-aed4-b30304c06694","Type":"ContainerStarted","Data":"286eac9435444b677843ef172cb4bd90a5566faf7440010a7f0e55355bb044ec"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.335220 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f8nq8" event={"ID":"9ec164c8-422d-443c-aed4-b30304c06694","Type":"ContainerStarted","Data":"7a271c346a7bb4a034ab02b8646d8772d817c2330fb69c4c4eabf742641f37f0"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.336063 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f8nq8" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.342814 4675 generic.go:334] "Generic (PLEG): container finished" podID="15212a01-a933-41ec-96ad-d0fb79722f68" containerID="f68c9f2045cc106050f8e37d0d5307e2d94fe5971ca6d21b0dc5469d07159f9f" exitCode=0 Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.342869 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" event={"ID":"15212a01-a933-41ec-96ad-d0fb79722f68","Type":"ContainerDied","Data":"f68c9f2045cc106050f8e37d0d5307e2d94fe5971ca6d21b0dc5469d07159f9f"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.345230 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" event={"ID":"6f6b57cd-8b1f-4d89-9ff3-a00efc202135","Type":"ContainerStarted","Data":"6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.345915 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.347186 4675 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5g6nl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.347236 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" podUID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.362531 4675 generic.go:334] "Generic (PLEG): container finished" podID="14695b26-4567-40f0-a892-25172bd0fb0a" containerID="7be04d3b349ff1023a2b87446a2fb654775f6efde1188d2926f9b93b94c88976" exitCode=0 Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.362629 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" event={"ID":"14695b26-4567-40f0-a892-25172bd0fb0a","Type":"ContainerDied","Data":"7be04d3b349ff1023a2b87446a2fb654775f6efde1188d2926f9b93b94c88976"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.374020 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-f8nq8" podStartSLOduration=7.373999459 podStartE2EDuration="7.373999459s" podCreationTimestamp="2026-03-20 16:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.371638781 +0000 UTC m=+218.405268328" watchObservedRunningTime="2026-03-20 16:04:58.373999459 +0000 UTC m=+218.407628996" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.379631 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" event={"ID":"c311c63c-0f7e-4435-a2e3-fbc85a59594e","Type":"ContainerStarted","Data":"34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.380731 4675 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gnrqz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.380757 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.390069 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-699d9" event={"ID":"a8d888de-19f8-4e57-80f4-f3831177b6fd","Type":"ContainerStarted","Data":"67878854e66177cd4ba67680b330432ae386a5d0e0f04ab5675a778c6466e031"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.403489 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" event={"ID":"ffe62543-215b-47d8-9e48-de4466ce84f2","Type":"ContainerStarted","Data":"cd1fa8a3076546ed90bee8bff1c073246a3c3ed41a7d2eaea489dc70e435b890"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.409517 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" event={"ID":"585603c7-dfdf-4343-a32d-500c6868137e","Type":"ContainerStarted","Data":"7fe9c84b48148bef9fc597d291c2f6aca8927ffa1c62fa060cb54f73b793b0eb"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.409550 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" event={"ID":"585603c7-dfdf-4343-a32d-500c6868137e","Type":"ContainerStarted","Data":"fe5ba51dee42cb111d209392f1ebf74f4cb710356be4cee01d742c31b74292cd"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.412489 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" event={"ID":"45ca71c6-ab6e-4f92-ba2f-88096793d64b","Type":"ContainerStarted","Data":"cd3b9de1936ea33dff17538f55f190d4febdfe48abc29544dfb17c8f68214a3f"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.417521 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" event={"ID":"bbaf20e3-9148-4012-aa70-0c8ba76e02c2","Type":"ContainerStarted","Data":"f7242862c824d8041f32b90f330c6a0406a28d87fae1500e5808af96bd5c5ba2"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.417546 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" event={"ID":"bbaf20e3-9148-4012-aa70-0c8ba76e02c2","Type":"ContainerStarted","Data":"7c6bbbce86519779f2922d92113ce8ba3fd820fda5cb980b44de66eec03d2718"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.425931 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" event={"ID":"a30f4c73-2cdf-4fa4-b870-1bce73d3ceed","Type":"ContainerStarted","Data":"c0dc7ba4c97a0d1bc40d03935240a2767ee62385ebd769dacbb74b5f463acea8"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.435196 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" event={"ID":"20519e1c-a631-42ac-8bcb-c5a18b3ac4b0","Type":"ContainerStarted","Data":"115b0063f1eacc58fdee2ece67c009d03e871d440cbb1b88814691efaac86797"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.436140 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.437702 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:58.937687298 +0000 UTC m=+218.971316835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.461237 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhnk6" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.474168 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" event={"ID":"29b966ff-c9bd-42b4-bf25-6f942fc2bb4d","Type":"ContainerStarted","Data":"548511bd92f52f0fc8c413d2563ec6d58984c1230171243c772af6e8ac0df87e"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.474818 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.502966 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" podStartSLOduration=156.502947342 podStartE2EDuration="2m36.502947342s" podCreationTimestamp="2026-03-20 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.501783738 +0000 UTC m=+218.535413285" watchObservedRunningTime="2026-03-20 16:04:58.502947342 +0000 UTC m=+218.536576879" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.507618 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6cz9" event={"ID":"57069867-4c85-4262-ac54-05d5257ad81b","Type":"ContainerStarted","Data":"82b27400d8b42b7fcd5b900a024f37859d75ea26defccffb8cf7b21a317a5e35"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.509826 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cnrtx" event={"ID":"ccabe656-71a5-4e5b-b5f8-093e1b38f62c","Type":"ContainerStarted","Data":"d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.517946 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" event={"ID":"74e03833-c657-4de8-935a-fd7b8580d62b","Type":"ContainerStarted","Data":"3ae5045a9f4c8b6c53983404307720f3556a7f467e92af47e5a5bbaf6db24ba9"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.521690 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms29d" event={"ID":"6152960d-fbba-4874-9127-cdd83b1d9d7a","Type":"ContainerStarted","Data":"cc5075410435ecc42e440a88d1310d4939d740888a34f2b3b47b2ac4b4303ae4"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.538012 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.539316 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:59.039299611 +0000 UTC m=+219.072929148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.547424 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" event={"ID":"d8efdad2-e2fa-4003-bd03-117a399b9df0","Type":"ContainerStarted","Data":"5728eaf8d4808c58f55fde8649446661686192df74630fce6122b6a6cb1cfe77"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.547956 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9gz5w" event={"ID":"d8efdad2-e2fa-4003-bd03-117a399b9df0","Type":"ContainerStarted","Data":"7b1cbbf944d019eb6f602b5399227fbe2a640c2d00ed8b734cbbd281973e7629"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.556777 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" event={"ID":"1bebdfb7-34f5-4e90-b64e-c1442738c51d","Type":"ContainerStarted","Data":"fe96bb2d55cb617254c9b92c3570aff9f95f6a868e24704b5ada146bbc6ed581"} Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.556815 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.574240 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-ccrx9 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.574352 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ccrx9" podUID="3074e872-f732-42c6-b7c3-6a88e0f5b81c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.599453 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-k6cqj" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.611750 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" podStartSLOduration=157.611714782 podStartE2EDuration="2m37.611714782s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.588331917 +0000 UTC m=+218.621961474" watchObservedRunningTime="2026-03-20 16:04:58.611714782 +0000 UTC m=+218.645344319" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.630972 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9tmzc" podStartSLOduration=157.630957628 podStartE2EDuration="2m37.630957628s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.628048604 +0000 UTC m=+218.661678151" watchObservedRunningTime="2026-03-20 16:04:58.630957628 +0000 UTC m=+218.664587165" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 
16:04:58.641735 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.651214 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.151198592 +0000 UTC m=+219.184828319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.657089 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.742470 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vh65f" podStartSLOduration=156.742448547 podStartE2EDuration="2m36.742448547s" podCreationTimestamp="2026-03-20 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.685077551 +0000 UTC m=+218.718707088" watchObservedRunningTime="2026-03-20 
16:04:58.742448547 +0000 UTC m=+218.776078084" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.745324 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.747203 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.247176094 +0000 UTC m=+219.280805631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.796458 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-699d9" podStartSLOduration=7.796442386 podStartE2EDuration="7.796442386s" podCreationTimestamp="2026-03-20 16:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.744655021 +0000 UTC m=+218.778284558" watchObservedRunningTime="2026-03-20 16:04:58.796442386 +0000 UTC m=+218.830071923" Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.832960 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-dxwrc" podStartSLOduration=157.83294559 podStartE2EDuration="2m37.83294559s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.832273091 +0000 UTC m=+218.865902628" watchObservedRunningTime="2026-03-20 16:04:58.83294559 +0000 UTC m=+218.866575127"
Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.841691 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6xn2m" podStartSLOduration=157.841672852 podStartE2EDuration="2m37.841672852s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.798055063 +0000 UTC m=+218.831684600" watchObservedRunningTime="2026-03-20 16:04:58.841672852 +0000 UTC m=+218.875302389"
Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.847781 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.848242 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.348223911 +0000 UTC m=+219.381853448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.898039 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gx27v" podStartSLOduration=156.898010999 podStartE2EDuration="2m36.898010999s" podCreationTimestamp="2026-03-20 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.895468956 +0000 UTC m=+218.929098493" watchObservedRunningTime="2026-03-20 16:04:58.898010999 +0000 UTC m=+218.931640536"
Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.952096 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:58 crc kubenswrapper[4675]: E0320 16:04:58.952404 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.452382639 +0000 UTC m=+219.486012176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:58 crc kubenswrapper[4675]: I0320 16:04:58.983545 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-knk82" podStartSLOduration=157.983525208 podStartE2EDuration="2m37.983525208s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:58.944174452 +0000 UTC m=+218.977803999" watchObservedRunningTime="2026-03-20 16:04:58.983525208 +0000 UTC m=+219.017154745"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.011130 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-27f97"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.055508 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.055893 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.555877427 +0000 UTC m=+219.589506964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.074136 4675 ???:1] "http: TLS handshake error from 192.168.126.11:59064: no serving certificate available for the kubelet"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.088899 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 16:04:59 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld
Mar 20 16:04:59 crc kubenswrapper[4675]: [+]process-running ok
Mar 20 16:04:59 crc kubenswrapper[4675]: healthz check failed
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.088950 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.156492 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.156832 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.656814972 +0000 UTC m=+219.690444509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.260666 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.261358 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.76131917 +0000 UTC m=+219.794948707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.297987 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wgwkt"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.362952 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.363175 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.8631426 +0000 UTC m=+219.896772137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.363258 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.363647 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.863638824 +0000 UTC m=+219.897268581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.464421 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.464847 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:59.964805835 +0000 UTC m=+219.998435372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.563238 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" event={"ID":"15212a01-a933-41ec-96ad-d0fb79722f68","Type":"ContainerStarted","Data":"8db1a9fa419a2f31705b3a52575d8687dbbe6bf1a3531a0b233ad34e76fe720a"}
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.565838 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.566525 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.066493041 +0000 UTC m=+220.100122738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.575994 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" event={"ID":"14695b26-4567-40f0-a892-25172bd0fb0a","Type":"ContainerStarted","Data":"5450d96f32c00060b41b46de05e7278425941d1f6cf3903d118ada59720b4d8b"}
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.577590 4675 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gnrqz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.577646 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.593717 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.666749 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.680234 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.180177094 +0000 UTC m=+220.213806631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.772384 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.772669 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.272656814 +0000 UTC m=+220.306286351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.874460 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.874820 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.374804674 +0000 UTC m=+220.408434211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:59 crc kubenswrapper[4675]: I0320 16:04:59.976604 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:04:59 crc kubenswrapper[4675]: E0320 16:04:59.977047 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.477027125 +0000 UTC m=+220.510656662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.078672 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.078946 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.578921138 +0000 UTC m=+220.612550675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.079195 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.079494 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.579487824 +0000 UTC m=+220.613117351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.084752 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 16:05:00 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld
Mar 20 16:05:00 crc kubenswrapper[4675]: [+]process-running ok
Mar 20 16:05:00 crc kubenswrapper[4675]: healthz check failed
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.084826 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.176714 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds6wn"]
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.176976 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" podUID="59253b32-b908-48ed-bfb6-d3374fbcd40b" containerName="controller-manager" containerID="cri-o://7de2f6e29598a5787e4df34a4fa04bc6565a724c7db13daf7d6afc5667745924" gracePeriod=30
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.179902 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.180099 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.680064488 +0000 UTC m=+220.713694025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.180217 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.180522 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.680505491 +0000 UTC m=+220.714135028 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.192600 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"]
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.281645 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.281981 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.781964851 +0000 UTC m=+220.815594388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.383027 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.383303 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.883292626 +0000 UTC m=+220.916922163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.412064 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwsgc"]
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.413083 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.414694 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.427178 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwsgc"]
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.484137 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.484347 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.984323194 +0000 UTC m=+221.017952731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.484603 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltc8d\" (UniqueName: \"kubernetes.io/projected/239989a6-b2d8-4061-88d4-a8a6a656fe6b-kube-api-access-ltc8d\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.484701 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-utilities\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.484808 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.484872 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-catalog-content\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.485303 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:00.985285191 +0000 UTC m=+221.018914728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.503913 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.504707 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.508969 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.509177 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.515870 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.586328 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.586529 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.086479433 +0000 UTC m=+221.120108970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.587051 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-catalog-content\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.587140 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.587179 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltc8d\" (UniqueName: \"kubernetes.io/projected/239989a6-b2d8-4061-88d4-a8a6a656fe6b-kube-api-access-ltc8d\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.587201 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.587236 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-utilities\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.587264 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.587798 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.087780761 +0000 UTC m=+221.121410298 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.588554 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-utilities\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.588677 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-catalog-content\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.598025 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7n9r9"] Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.599078 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.602476 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.614033 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7n9r9"] Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.643275 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltc8d\" (UniqueName: \"kubernetes.io/projected/239989a6-b2d8-4061-88d4-a8a6a656fe6b-kube-api-access-ltc8d\") pod \"certified-operators-bwsgc\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") " pod="openshift-marketplace/certified-operators-bwsgc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.649743 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" event={"ID":"15212a01-a933-41ec-96ad-d0fb79722f68","Type":"ContainerStarted","Data":"a55a71f0e3f6fb12b872d552c5938700853ab4514467c1f3c609d9ab14f62992"} Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.655030 4675 generic.go:334] "Generic (PLEG): container finished" podID="59253b32-b908-48ed-bfb6-d3374fbcd40b" containerID="7de2f6e29598a5787e4df34a4fa04bc6565a724c7db13daf7d6afc5667745924" exitCode=0 Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.656098 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" event={"ID":"59253b32-b908-48ed-bfb6-d3374fbcd40b","Type":"ContainerDied","Data":"7de2f6e29598a5787e4df34a4fa04bc6565a724c7db13daf7d6afc5667745924"} Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.689510 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.689861 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-catalog-content\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.689935 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.689966 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.690011 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvc5m\" (UniqueName: \"kubernetes.io/projected/79c663f2-11ef-4c23-ac68-b8bb32997e77-kube-api-access-tvc5m\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.690066 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-utilities\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.690204 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.190179608 +0000 UTC m=+221.223809145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.690411 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.724696 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.730194 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwsgc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.754601 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" podStartSLOduration=159.754585107 podStartE2EDuration="2m39.754585107s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:00.685314567 +0000 UTC m=+220.718944104" watchObservedRunningTime="2026-03-20 16:05:00.754585107 +0000 UTC m=+220.788214644" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.788019 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cmc4p"] Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.792615 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.793633 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" podStartSLOduration=159.793616424 podStartE2EDuration="2m39.793616424s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:00.782560105 +0000 UTC m=+220.816189632" watchObservedRunningTime="2026-03-20 16:05:00.793616424 +0000 UTC m=+220.827245961" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.795544 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvc5m\" (UniqueName: \"kubernetes.io/projected/79c663f2-11ef-4c23-ac68-b8bb32997e77-kube-api-access-tvc5m\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " 
pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.795589 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.795607 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-utilities\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.795682 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-catalog-content\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.797686 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.297675402 +0000 UTC m=+221.331304939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.798120 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-utilities\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.798598 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmc4p"] Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.801144 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-catalog-content\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.811432 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.824990 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.833320 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvc5m\" (UniqueName: \"kubernetes.io/projected/79c663f2-11ef-4c23-ac68-b8bb32997e77-kube-api-access-tvc5m\") pod \"community-operators-7n9r9\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") " pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.896521 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59253b32-b908-48ed-bfb6-d3374fbcd40b-serving-cert\") pod \"59253b32-b908-48ed-bfb6-d3374fbcd40b\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.896602 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-config\") pod \"59253b32-b908-48ed-bfb6-d3374fbcd40b\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.896633 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-984fz\" (UniqueName: \"kubernetes.io/projected/59253b32-b908-48ed-bfb6-d3374fbcd40b-kube-api-access-984fz\") pod \"59253b32-b908-48ed-bfb6-d3374fbcd40b\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.896854 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.896923 
4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-client-ca\") pod \"59253b32-b908-48ed-bfb6-d3374fbcd40b\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.897001 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-proxy-ca-bundles\") pod \"59253b32-b908-48ed-bfb6-d3374fbcd40b\" (UID: \"59253b32-b908-48ed-bfb6-d3374fbcd40b\") " Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.897150 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-catalog-content\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.897248 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqhln\" (UniqueName: \"kubernetes.io/projected/acb87e24-d219-4f7d-b28b-689cb6ccaa56-kube-api-access-nqhln\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.897277 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-utilities\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.898391 4675 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-client-ca" (OuterVolumeSpecName: "client-ca") pod "59253b32-b908-48ed-bfb6-d3374fbcd40b" (UID: "59253b32-b908-48ed-bfb6-d3374fbcd40b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.898924 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-config" (OuterVolumeSpecName: "config") pod "59253b32-b908-48ed-bfb6-d3374fbcd40b" (UID: "59253b32-b908-48ed-bfb6-d3374fbcd40b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.900436 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "59253b32-b908-48ed-bfb6-d3374fbcd40b" (UID: "59253b32-b908-48ed-bfb6-d3374fbcd40b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.900506 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.40049037 +0000 UTC m=+221.434119907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.903967 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59253b32-b908-48ed-bfb6-d3374fbcd40b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59253b32-b908-48ed-bfb6-d3374fbcd40b" (UID: "59253b32-b908-48ed-bfb6-d3374fbcd40b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.904167 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59253b32-b908-48ed-bfb6-d3374fbcd40b-kube-api-access-984fz" (OuterVolumeSpecName: "kube-api-access-984fz") pod "59253b32-b908-48ed-bfb6-d3374fbcd40b" (UID: "59253b32-b908-48ed-bfb6-d3374fbcd40b"). InnerVolumeSpecName "kube-api-access-984fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.948309 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.993974 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b8vm6"] Mar 20 16:05:00 crc kubenswrapper[4675]: E0320 16:05:00.994190 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59253b32-b908-48ed-bfb6-d3374fbcd40b" containerName="controller-manager" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.994202 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="59253b32-b908-48ed-bfb6-d3374fbcd40b" containerName="controller-manager" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.994469 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="59253b32-b908-48ed-bfb6-d3374fbcd40b" containerName="controller-manager" Mar 20 16:05:00 crc kubenswrapper[4675]: I0320 16:05:00.995212 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:00.998215 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8vm6"] Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:00.999228 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqhln\" (UniqueName: \"kubernetes.io/projected/acb87e24-d219-4f7d-b28b-689cb6ccaa56-kube-api-access-nqhln\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:00.999525 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-utilities\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " 
pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000015 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-utilities\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000062 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-catalog-content\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000114 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000249 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-catalog-content\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.000409 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:05:01.500396175 +0000 UTC m=+221.534025732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000536 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000605 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59253b32-b908-48ed-bfb6-d3374fbcd40b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000619 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000901 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-984fz\" (UniqueName: \"kubernetes.io/projected/59253b32-b908-48ed-bfb6-d3374fbcd40b-kube-api-access-984fz\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.000916 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59253b32-b908-48ed-bfb6-d3374fbcd40b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.053200 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nqhln\" (UniqueName: \"kubernetes.io/projected/acb87e24-d219-4f7d-b28b-689cb6ccaa56-kube-api-access-nqhln\") pod \"certified-operators-cmc4p\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") " pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.102247 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.102539 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9n9\" (UniqueName: \"kubernetes.io/projected/abca8440-77fa-48b9-a977-9bba2e267728-kube-api-access-sw9n9\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.102599 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-utilities\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.102650 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-catalog-content\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 
16:05:01.102914 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.602894215 +0000 UTC m=+221.636523762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.102955 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:01 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:01 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:01 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.102986 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.111113 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwsgc"] Mar 20 16:05:01 crc kubenswrapper[4675]: W0320 16:05:01.135548 4675 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239989a6_b2d8_4061_88d4_a8a6a656fe6b.slice/crio-58b2ab3ecae9230ac0ea02a62cab9cfa9804736cd315d7fdf103f286d15e5a22 WatchSource:0}: Error finding container 58b2ab3ecae9230ac0ea02a62cab9cfa9804736cd315d7fdf103f286d15e5a22: Status 404 returned error can't find the container with id 58b2ab3ecae9230ac0ea02a62cab9cfa9804736cd315d7fdf103f286d15e5a22 Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.200178 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmc4p" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.204251 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-utilities\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.204316 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-catalog-content\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.204366 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.204385 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sw9n9\" (UniqueName: \"kubernetes.io/projected/abca8440-77fa-48b9-a977-9bba2e267728-kube-api-access-sw9n9\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.205038 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-catalog-content\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.205078 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.705062685 +0000 UTC m=+221.738692222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.205269 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-utilities\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.232599 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9n9\" (UniqueName: \"kubernetes.io/projected/abca8440-77fa-48b9-a977-9bba2e267728-kube-api-access-sw9n9\") pod \"community-operators-b8vm6\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") " pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.306324 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.306476 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:05:01.806451622 +0000 UTC m=+221.840081159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.306653 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.306964 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.806952227 +0000 UTC m=+221.840581764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.329061 4675 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.341597 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.377285 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7n9r9"] Mar 20 16:05:01 crc kubenswrapper[4675]: W0320 16:05:01.405966 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c663f2_11ef_4c23_ac68_b8bb32997e77.slice/crio-c25c8a92963b364260b28534c9ac2f7cb175f07c14af9b01eafcc5ce938cb09e WatchSource:0}: Error finding container c25c8a92963b364260b28534c9ac2f7cb175f07c14af9b01eafcc5ce938cb09e: Status 404 returned error can't find the container with id c25c8a92963b364260b28534c9ac2f7cb175f07c14af9b01eafcc5ce938cb09e Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.407148 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.407249 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.907229602 +0000 UTC m=+221.940859139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.407564 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.407864 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:01.90785584 +0000 UTC m=+221.941485377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.450446 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 16:05:01 crc kubenswrapper[4675]: W0320 16:05:01.471671 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6af38b7a_17a6_4cc6_bda9_70fd3f4de6b4.slice/crio-b8460ea0c04658b49e0b959172c2cdc82369078cc9b3d35f395da38af55cf7a0 WatchSource:0}: Error finding container b8460ea0c04658b49e0b959172c2cdc82369078cc9b3d35f395da38af55cf7a0: Status 404 returned error can't find the container with id b8460ea0c04658b49e0b959172c2cdc82369078cc9b3d35f395da38af55cf7a0 Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.511244 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.511527 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:02.011512313 +0000 UTC m=+222.045141850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.614401 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.614896 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:02.114880618 +0000 UTC m=+222.148510145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.619184 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmc4p"] Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.683580 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.684201 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.686164 4675 ???:1] "http: TLS handshake error from 192.168.126.11:41364: no serving certificate available for the kubelet" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.686542 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.686864 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.696916 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.717459 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.717990 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:05:02.217971365 +0000 UTC m=+222.251600912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.727121 4675 generic.go:334] "Generic (PLEG): container finished" podID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerID="634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950" exitCode=0 Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.727207 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7n9r9" event={"ID":"79c663f2-11ef-4c23-ac68-b8bb32997e77","Type":"ContainerDied","Data":"634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.727234 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7n9r9" event={"ID":"79c663f2-11ef-4c23-ac68-b8bb32997e77","Type":"ContainerStarted","Data":"c25c8a92963b364260b28534c9ac2f7cb175f07c14af9b01eafcc5ce938cb09e"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 
16:05:01.741535 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8vm6"] Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.777800 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" event={"ID":"74e03833-c657-4de8-935a-fd7b8580d62b","Type":"ContainerStarted","Data":"55e1869a0943f59a3eb86c7fc45dbfac2eaa87d7ed71e2ea71254e1d68351d3c"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.778181 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" event={"ID":"74e03833-c657-4de8-935a-fd7b8580d62b","Type":"ContainerStarted","Data":"d4cc4d592204597865fa04f8f8c868a4300c20a8cbe398f840848a331913f78f"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.803044 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4","Type":"ContainerStarted","Data":"b8460ea0c04658b49e0b959172c2cdc82369078cc9b3d35f395da38af55cf7a0"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.811816 4675 generic.go:334] "Generic (PLEG): container finished" podID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerID="0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb" exitCode=0 Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.811901 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerDied","Data":"0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.811948 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" 
event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerStarted","Data":"58b2ab3ecae9230ac0ea02a62cab9cfa9804736cd315d7fdf103f286d15e5a22"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.815644 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" podUID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" containerName="route-controller-manager" containerID="cri-o://6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346" gracePeriod=30 Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.815960 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.828981 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.829043 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9928977-d919-4ae9-b9ed-834d0f42043d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.829126 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9928977-d919-4ae9-b9ed-834d0f42043d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:01 crc kubenswrapper[4675]: E0320 16:05:01.831053 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:05:02.33103996 +0000 UTC m=+222.364669487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mz2tv" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.836079 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds6wn" event={"ID":"59253b32-b908-48ed-bfb6-d3374fbcd40b","Type":"ContainerDied","Data":"ce4b86ce13c74df730d52e4341928612607559be78ee1a107621b33fc75804f1"} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.836194 4675 scope.go:117] "RemoveContainer" containerID="7de2f6e29598a5787e4df34a4fa04bc6565a724c7db13daf7d6afc5667745924" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.908879 4675 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T16:05:01.329103166Z","Handler":null,"Name":""} Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.918281 4675 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 
16:05:01.918617 4675 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.929912 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.930256 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9928977-d919-4ae9-b9ed-834d0f42043d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.930418 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9928977-d919-4ae9-b9ed-834d0f42043d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.930733 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9928977-d919-4ae9-b9ed-834d0f42043d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.955812 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: 
"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 16:05:01 crc kubenswrapper[4675]: I0320 16:05:01.994053 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9928977-d919-4ae9-b9ed-834d0f42043d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.033384 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.037734 4675 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.038046 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.073090 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-998d9bd74-jskv5"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.073792 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.082989 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mz2tv\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") " pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.083380 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.083544 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.083623 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.083679 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.083726 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.084412 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.084663 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:02 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:02 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:02 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.084879 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.094341 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.098931 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.123735 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds6wn"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.135821 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-config\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.135884 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c980850b-711f-4780-842b-5e616b38b702-serving-cert\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.135913 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-client-ca\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.135933 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-proxy-ca-bundles\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.135956 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mmx\" (UniqueName: \"kubernetes.io/projected/c980850b-711f-4780-842b-5e616b38b702-kube-api-access-c4mmx\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.137201 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds6wn"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.145831 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-998d9bd74-jskv5"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.238357 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-config\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.238413 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c980850b-711f-4780-842b-5e616b38b702-serving-cert\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.238440 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-client-ca\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.238457 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-proxy-ca-bundles\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.238478 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mmx\" (UniqueName: \"kubernetes.io/projected/c980850b-711f-4780-842b-5e616b38b702-kube-api-access-c4mmx\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.240373 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-client-ca\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.240724 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-config\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc 
kubenswrapper[4675]: I0320 16:05:02.242853 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-proxy-ca-bundles\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.253939 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c980850b-711f-4780-842b-5e616b38b702-serving-cert\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.263973 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mmx\" (UniqueName: \"kubernetes.io/projected/c980850b-711f-4780-842b-5e616b38b702-kube-api-access-c4mmx\") pod \"controller-manager-998d9bd74-jskv5\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.283421 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.369562 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.425172 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.440804 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm7wr\" (UniqueName: \"kubernetes.io/projected/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-kube-api-access-xm7wr\") pod \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.440887 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-config\") pod \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.440995 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-client-ca\") pod \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.441024 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-serving-cert\") pod \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\" (UID: \"6f6b57cd-8b1f-4d89-9ff3-a00efc202135\") " Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.444585 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-config" (OuterVolumeSpecName: "config") pod "6f6b57cd-8b1f-4d89-9ff3-a00efc202135" (UID: "6f6b57cd-8b1f-4d89-9ff3-a00efc202135"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.444601 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f6b57cd-8b1f-4d89-9ff3-a00efc202135" (UID: "6f6b57cd-8b1f-4d89-9ff3-a00efc202135"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.445514 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f6b57cd-8b1f-4d89-9ff3-a00efc202135" (UID: "6f6b57cd-8b1f-4d89-9ff3-a00efc202135"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.451468 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-kube-api-access-xm7wr" (OuterVolumeSpecName: "kube-api-access-xm7wr") pod "6f6b57cd-8b1f-4d89-9ff3-a00efc202135" (UID: "6f6b57cd-8b1f-4d89-9ff3-a00efc202135"). InnerVolumeSpecName "kube-api-access-xm7wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.451683 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.546506 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.546911 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.546926 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm7wr\" (UniqueName: \"kubernetes.io/projected/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-kube-api-access-xm7wr\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.546937 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b57cd-8b1f-4d89-9ff3-a00efc202135-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.594758 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k8lpk"] Mar 20 16:05:02 crc kubenswrapper[4675]: E0320 16:05:02.594990 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" containerName="route-controller-manager" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.595001 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" containerName="route-controller-manager" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.598375 4675 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" containerName="route-controller-manager" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.601880 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.605168 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8lpk"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.608139 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.693569 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59253b32-b908-48ed-bfb6-d3374fbcd40b" path="/var/lib/kubelet/pods/59253b32-b908-48ed-bfb6-d3374fbcd40b/volumes" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.694402 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.750200 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szdg\" (UniqueName: \"kubernetes.io/projected/4d82201b-fc6f-4776-87f3-7cf89822bda5-kube-api-access-8szdg\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.750249 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-utilities\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc 
kubenswrapper[4675]: I0320 16:05:02.750275 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-catalog-content\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.777231 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mz2tv"] Mar 20 16:05:02 crc kubenswrapper[4675]: W0320 16:05:02.816861 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b03bbc4_2ec2_4f58_bd4d_20770e1fb461.slice/crio-a08847d75860dc064cc57090f6de4f351c7d20245460f43839502b5ea2c0a60c WatchSource:0}: Error finding container a08847d75860dc064cc57090f6de4f351c7d20245460f43839502b5ea2c0a60c: Status 404 returned error can't find the container with id a08847d75860dc064cc57090f6de4f351c7d20245460f43839502b5ea2c0a60c Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.836371 4675 generic.go:334] "Generic (PLEG): container finished" podID="6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4" containerID="52522e9acee639e58fc8160d89311a34f363bf1f0393b150625305a093c6a989" exitCode=0 Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.836435 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4","Type":"ContainerDied","Data":"52522e9acee639e58fc8160d89311a34f363bf1f0393b150625305a093c6a989"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.841674 4675 generic.go:334] "Generic (PLEG): container finished" podID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" containerID="6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346" exitCode=0 Mar 20 16:05:02 crc kubenswrapper[4675]: 
I0320 16:05:02.841743 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" event={"ID":"6f6b57cd-8b1f-4d89-9ff3-a00efc202135","Type":"ContainerDied","Data":"6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.841775 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.841795 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl" event={"ID":"6f6b57cd-8b1f-4d89-9ff3-a00efc202135","Type":"ContainerDied","Data":"2f11a176e50e61da7123cbb4c15aaab55e99d3aa958ac6b61e71c7e1b34aaa94"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.841815 4675 scope.go:117] "RemoveContainer" containerID="6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.854314 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szdg\" (UniqueName: \"kubernetes.io/projected/4d82201b-fc6f-4776-87f3-7cf89822bda5-kube-api-access-8szdg\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.854451 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-utilities\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.854509 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-catalog-content\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.855223 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-catalog-content\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.855845 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-utilities\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.870843 4675 generic.go:334] "Generic (PLEG): container finished" podID="9ca3b7f0-d36c-487e-938c-da2d8781061a" containerID="c8d81c49af6e574ff8e1c069754226c6d21a33d0af447a96f5b2c3c3b23d8944" exitCode=0 Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.871119 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" event={"ID":"9ca3b7f0-d36c-487e-938c-da2d8781061a","Type":"ContainerDied","Data":"c8d81c49af6e574ff8e1c069754226c6d21a33d0af447a96f5b2c3c3b23d8944"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.875240 4675 generic.go:334] "Generic (PLEG): container finished" podID="abca8440-77fa-48b9-a977-9bba2e267728" containerID="2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9" exitCode=0 Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.875334 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerDied","Data":"2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.875433 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerStarted","Data":"c62045a7eb5f427789ee6e0f932fd975272d29132df0528cf3741b6a9e895f44"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.880895 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szdg\" (UniqueName: \"kubernetes.io/projected/4d82201b-fc6f-4776-87f3-7cf89822bda5-kube-api-access-8szdg\") pod \"redhat-marketplace-k8lpk\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") " pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.887550 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9928977-d919-4ae9-b9ed-834d0f42043d","Type":"ContainerStarted","Data":"2f594d06eb6080ce2909dd8301de2da7334190d133b94e47d59a251d74feeaa2"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.904350 4675 generic.go:334] "Generic (PLEG): container finished" podID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerID="543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85" exitCode=0 Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.904388 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerDied","Data":"543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.904424 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" 
event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerStarted","Data":"f267d036ee9d2c8b29228395201d091413f3be28a298af2e39692749b44b4937"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.913042 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" event={"ID":"74e03833-c657-4de8-935a-fd7b8580d62b","Type":"ContainerStarted","Data":"120ebbb8d3b7a673e5ccc449b6c6c600be0d96d49b7688602b1cec17ac7f2c50"} Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.913893 4675 scope.go:117] "RemoveContainer" containerID="6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346" Mar 20 16:05:02 crc kubenswrapper[4675]: E0320 16:05:02.929515 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346\": container with ID starting with 6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346 not found: ID does not exist" containerID="6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.929995 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346"} err="failed to get container status \"6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346\": rpc error: code = NotFound desc = could not find container \"6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346\": container with ID starting with 6a9081e0065312c7c521a63421afb354343b83b65800a42858f5d0beacb90346 not found: ID does not exist" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.968487 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.976479 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-qn4j5" podStartSLOduration=11.976463893 podStartE2EDuration="11.976463893s" podCreationTimestamp="2026-03-20 16:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:02.967901245 +0000 UTC m=+223.001530802" watchObservedRunningTime="2026-03-20 16:05:02.976463893 +0000 UTC m=+223.010093430" Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.976639 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-998d9bd74-jskv5"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.989978 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"] Mar 20 16:05:02 crc kubenswrapper[4675]: I0320 16:05:02.994615 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5g6nl"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.011420 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4jq5j"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.013114 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.018123 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jq5j"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.083137 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:03 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:03 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:03 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.083194 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.158597 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxpfv\" (UniqueName: \"kubernetes.io/projected/e85ec396-6d81-4ad2-b269-315df42e61c4-kube-api-access-vxpfv\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.159093 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-utilities\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.159168 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-catalog-content\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.260920 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxpfv\" (UniqueName: \"kubernetes.io/projected/e85ec396-6d81-4ad2-b269-315df42e61c4-kube-api-access-vxpfv\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.261002 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-utilities\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.261035 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-catalog-content\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.261640 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-catalog-content\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.262377 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-utilities\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.279456 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxpfv\" (UniqueName: \"kubernetes.io/projected/e85ec396-6d81-4ad2-b269-315df42e61c4-kube-api-access-vxpfv\") pod \"redhat-marketplace-4jq5j\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") " pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.288844 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8lpk"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.348936 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jq5j" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.478388 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-ccrx9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.478439 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ccrx9" podUID="3074e872-f732-42c6-b7c3-6a88e0f5b81c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.478473 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-ccrx9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.478524 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ccrx9" podUID="3074e872-f732-42c6-b7c3-6a88e0f5b81c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.597599 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-blmfl"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.598974 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.601255 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.608716 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-blmfl"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.670956 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-utilities\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.671081 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbpv\" (UniqueName: \"kubernetes.io/projected/ac02846f-d933-4f76-9085-19a28023c633-kube-api-access-spbpv\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " 
pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.671160 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-catalog-content\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.772639 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-catalog-content\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.772779 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-utilities\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.772845 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbpv\" (UniqueName: \"kubernetes.io/projected/ac02846f-d933-4f76-9085-19a28023c633-kube-api-access-spbpv\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.773380 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-catalog-content\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " 
pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.773522 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-utilities\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.796054 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.796288 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.807652 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.828488 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbpv\" (UniqueName: \"kubernetes.io/projected/ac02846f-d933-4f76-9085-19a28023c633-kube-api-access-spbpv\") pod \"redhat-operators-blmfl\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") " pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.848410 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.868112 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jq5j"] Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.927661 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" 
event={"ID":"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461","Type":"ContainerStarted","Data":"669672e347272c8e516eebbf132a7f42388c9f8cdb077afe28f3e30743575b62"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.928060 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" event={"ID":"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461","Type":"ContainerStarted","Data":"a08847d75860dc064cc57090f6de4f351c7d20245460f43839502b5ea2c0a60c"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.928084 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.929199 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jq5j" event={"ID":"e85ec396-6d81-4ad2-b269-315df42e61c4","Type":"ContainerStarted","Data":"1a8814c23df2d3eb1cccc52cd6c3edbdbfd5d21ed7c4764640893f9f012e8d58"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.932471 4675 generic.go:334] "Generic (PLEG): container finished" podID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerID="ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc" exitCode=0 Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.932554 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerDied","Data":"ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.932632 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerStarted","Data":"dc0965d4a3bd1e1d3639e1beef1975329b5d90351313e41c22a6ef00ea2a833e"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.942595 4675 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.944182 4675 generic.go:334] "Generic (PLEG): container finished" podID="f9928977-d919-4ae9-b9ed-834d0f42043d" containerID="863360fa248dbad8d814ba6968647ac5cea13f23ee738f1d7b5676ea8e358fbf" exitCode=0 Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.944245 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9928977-d919-4ae9-b9ed-834d0f42043d","Type":"ContainerDied","Data":"863360fa248dbad8d814ba6968647ac5cea13f23ee738f1d7b5676ea8e358fbf"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.951430 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" event={"ID":"c980850b-711f-4780-842b-5e616b38b702","Type":"ContainerStarted","Data":"b458384a77831deefbbd403d6f60df85369516681583c94758bc064669a5b894"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.951457 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" event={"ID":"c980850b-711f-4780-842b-5e616b38b702","Type":"ContainerStarted","Data":"8037478e7a857a090166834f635445f11bcafec671a7f51ed1bdad31856924e8"} Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.951473 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.957875 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6v2jn" Mar 20 16:05:03 crc kubenswrapper[4675]: I0320 16:05:03.959347 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:03 crc kubenswrapper[4675]: 
I0320 16:05:03.961504 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" podStartSLOduration=162.961486875 podStartE2EDuration="2m42.961486875s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:03.946444411 +0000 UTC m=+223.980073948" watchObservedRunningTime="2026-03-20 16:05:03.961486875 +0000 UTC m=+223.995116422" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.030843 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l5s99"] Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.032870 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.038716 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" podStartSLOduration=4.038695145 podStartE2EDuration="4.038695145s" podCreationTimestamp="2026-03-20 16:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:04.021084246 +0000 UTC m=+224.054713813" watchObservedRunningTime="2026-03-20 16:05:04.038695145 +0000 UTC m=+224.072324682" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.040726 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5s99"] Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.086454 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.093148 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:04 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:04 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:04 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.093202 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.111326 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq"] Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.130422 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.134012 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.134237 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.134400 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.134657 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.134813 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.134919 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.169157 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq"] Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.191501 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-utilities\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.192484 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-config\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.192742 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqv7x\" (UniqueName: \"kubernetes.io/projected/bdbc9640-d445-44d0-a797-0159a8d318a7-kube-api-access-fqv7x\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.192964 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbc9640-d445-44d0-a797-0159a8d318a7-serving-cert\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.193010 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-catalog-content\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.193166 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-client-ca\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.194111 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88p6\" (UniqueName: \"kubernetes.io/projected/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-kube-api-access-q88p6\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298170 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-client-ca\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298201 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88p6\" (UniqueName: \"kubernetes.io/projected/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-kube-api-access-q88p6\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298227 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-utilities\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298249 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-config\") pod 
\"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298278 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqv7x\" (UniqueName: \"kubernetes.io/projected/bdbc9640-d445-44d0-a797-0159a8d318a7-kube-api-access-fqv7x\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298303 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbc9640-d445-44d0-a797-0159a8d318a7-serving-cert\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298317 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-catalog-content\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.298821 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-catalog-content\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.299288 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-utilities\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.299986 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-config\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.301535 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-client-ca\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.304587 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbc9640-d445-44d0-a797-0159a8d318a7-serving-cert\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.336792 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88p6\" (UniqueName: \"kubernetes.io/projected/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-kube-api-access-q88p6\") pod \"redhat-operators-l5s99\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") " pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.344833 4675 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fqv7x\" (UniqueName: \"kubernetes.io/projected/bdbc9640-d445-44d0-a797-0159a8d318a7-kube-api-access-fqv7x\") pod \"route-controller-manager-5c4d55c496-vhrgq\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.414143 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.414883 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.416214 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.420735 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.420793 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.425239 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.425287 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.446622 4675 patch_prober.go:28] interesting pod/console-f9d7485db-cnrtx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.446673 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cnrtx" podUID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.447245 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.447640 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.491413 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.504468 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume\") pod \"9ca3b7f0-d36c-487e-938c-da2d8781061a\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.504538 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume\") pod \"9ca3b7f0-d36c-487e-938c-da2d8781061a\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.504607 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdbhf\" (UniqueName: \"kubernetes.io/projected/9ca3b7f0-d36c-487e-938c-da2d8781061a-kube-api-access-xdbhf\") pod \"9ca3b7f0-d36c-487e-938c-da2d8781061a\" (UID: \"9ca3b7f0-d36c-487e-938c-da2d8781061a\") " Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.508632 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume" (OuterVolumeSpecName: "config-volume") pod "9ca3b7f0-d36c-487e-938c-da2d8781061a" (UID: "9ca3b7f0-d36c-487e-938c-da2d8781061a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.519926 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9ca3b7f0-d36c-487e-938c-da2d8781061a" (UID: "9ca3b7f0-d36c-487e-938c-da2d8781061a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.520116 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca3b7f0-d36c-487e-938c-da2d8781061a-kube-api-access-xdbhf" (OuterVolumeSpecName: "kube-api-access-xdbhf") pod "9ca3b7f0-d36c-487e-938c-da2d8781061a" (UID: "9ca3b7f0-d36c-487e-938c-da2d8781061a"). InnerVolumeSpecName "kube-api-access-xdbhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.527022 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.605800 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kube-api-access\") pod \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.607892 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kubelet-dir\") pod \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\" (UID: \"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4\") " Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.608149 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca3b7f0-d36c-487e-938c-da2d8781061a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.608159 4675 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ca3b7f0-d36c-487e-938c-da2d8781061a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc 
kubenswrapper[4675]: I0320 16:05:04.608650 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdbhf\" (UniqueName: \"kubernetes.io/projected/9ca3b7f0-d36c-487e-938c-da2d8781061a-kube-api-access-xdbhf\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.608697 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4" (UID: "6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.610224 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4" (UID: "6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.700950 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6b57cd-8b1f-4d89-9ff3-a00efc202135" path="/var/lib/kubelet/pods/6f6b57cd-8b1f-4d89-9ff3-a00efc202135/volumes" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.703328 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-blmfl"] Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.709440 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.709472 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.864135 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq"] Mar 20 16:05:04 crc kubenswrapper[4675]: W0320 16:05:04.885231 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbc9640_d445_44d0_a797_0159a8d318a7.slice/crio-c034fae2f5c99fbc42b24eb6d1781116b06bc06b157c8677db2c5ea3001badc4 WatchSource:0}: Error finding container c034fae2f5c99fbc42b24eb6d1781116b06bc06b157c8677db2c5ea3001badc4: Status 404 returned error can't find the container with id c034fae2f5c99fbc42b24eb6d1781116b06bc06b157c8677db2c5ea3001badc4 Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.917686 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5s99"] Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.949471 4675 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.970466 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" event={"ID":"9ca3b7f0-d36c-487e-938c-da2d8781061a","Type":"ContainerDied","Data":"aa16fd362bd2b22fc7d1d68114a67fdd9a37181e4c2f3d9f0ec42275ba58ac75"} Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.970510 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa16fd362bd2b22fc7d1d68114a67fdd9a37181e4c2f3d9f0ec42275ba58ac75" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.970572 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-fncng" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.977341 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerStarted","Data":"092e6bdf0b4c1a6d485a0bb192d6667c6344e7e72ccada2b3181b94fc3e242ef"} Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.982470 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerStarted","Data":"0da9a7833586d5e50ace11f91c22665d360749d0c5e13d8f3865e2358a214a03"} Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.990183 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.990845 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4","Type":"ContainerDied","Data":"b8460ea0c04658b49e0b959172c2cdc82369078cc9b3d35f395da38af55cf7a0"} Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.990944 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8460ea0c04658b49e0b959172c2cdc82369078cc9b3d35f395da38af55cf7a0" Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.996130 4675 generic.go:334] "Generic (PLEG): container finished" podID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerID="47e353cc2637519311e8e9b67b7749e422bcdc10d726766c4ae780531e108d33" exitCode=0 Mar 20 16:05:04 crc kubenswrapper[4675]: I0320 16:05:04.996284 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jq5j" event={"ID":"e85ec396-6d81-4ad2-b269-315df42e61c4","Type":"ContainerDied","Data":"47e353cc2637519311e8e9b67b7749e422bcdc10d726766c4ae780531e108d33"} Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.014413 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qdwzs" Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.083224 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:05 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:05 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:05 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.083303 4675 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.388148 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.422229 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9928977-d919-4ae9-b9ed-834d0f42043d-kubelet-dir\") pod \"f9928977-d919-4ae9-b9ed-834d0f42043d\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.422361 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9928977-d919-4ae9-b9ed-834d0f42043d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f9928977-d919-4ae9-b9ed-834d0f42043d" (UID: "f9928977-d919-4ae9-b9ed-834d0f42043d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.422458 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9928977-d919-4ae9-b9ed-834d0f42043d-kube-api-access\") pod \"f9928977-d919-4ae9-b9ed-834d0f42043d\" (UID: \"f9928977-d919-4ae9-b9ed-834d0f42043d\") " Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.422759 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9928977-d919-4ae9-b9ed-834d0f42043d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.428423 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9928977-d919-4ae9-b9ed-834d0f42043d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f9928977-d919-4ae9-b9ed-834d0f42043d" (UID: "f9928977-d919-4ae9-b9ed-834d0f42043d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:05 crc kubenswrapper[4675]: I0320 16:05:05.524287 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9928977-d919-4ae9-b9ed-834d0f42043d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.009628 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.009688 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9928977-d919-4ae9-b9ed-834d0f42043d","Type":"ContainerDied","Data":"2f594d06eb6080ce2909dd8301de2da7334190d133b94e47d59a251d74feeaa2"} Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.009754 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f594d06eb6080ce2909dd8301de2da7334190d133b94e47d59a251d74feeaa2" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.013677 4675 generic.go:334] "Generic (PLEG): container finished" podID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerID="57c749916a253ec6fa2fe5604f71cefa04578b75c4e4028cfa80a5453b5b4a2f" exitCode=0 Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.013806 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerDied","Data":"57c749916a253ec6fa2fe5604f71cefa04578b75c4e4028cfa80a5453b5b4a2f"} Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.019164 4675 generic.go:334] "Generic (PLEG): container finished" podID="ac02846f-d933-4f76-9085-19a28023c633" containerID="165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516" exitCode=0 Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.019261 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerDied","Data":"165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516"} Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.028224 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" 
event={"ID":"bdbc9640-d445-44d0-a797-0159a8d318a7","Type":"ContainerStarted","Data":"f9849d2ef068b97ee6c3e52811ecad8ad68dbd5e6cb6da523c394f80c0ae25b3"} Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.028268 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.028290 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" event={"ID":"bdbc9640-d445-44d0-a797-0159a8d318a7","Type":"ContainerStarted","Data":"c034fae2f5c99fbc42b24eb6d1781116b06bc06b157c8677db2c5ea3001badc4"} Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.033474 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f8nq8" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.035432 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.050920 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" podStartSLOduration=6.050903606 podStartE2EDuration="6.050903606s" podCreationTimestamp="2026-03-20 16:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:06.050066762 +0000 UTC m=+226.083696299" watchObservedRunningTime="2026-03-20 16:05:06.050903606 +0000 UTC m=+226.084533133" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.084657 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:06 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:06 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:06 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.084703 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:06 crc kubenswrapper[4675]: I0320 16:05:06.835876 4675 ???:1] "http: TLS handshake error from 192.168.126.11:41378: no serving certificate available for the kubelet" Mar 20 16:05:07 crc kubenswrapper[4675]: I0320 16:05:07.082830 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:07 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:07 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:07 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:07 crc kubenswrapper[4675]: I0320 16:05:07.082897 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:07 crc kubenswrapper[4675]: I0320 16:05:07.281207 4675 ???:1] "http: TLS handshake error from 192.168.126.11:41392: no serving certificate available for the kubelet" Mar 20 16:05:08 crc kubenswrapper[4675]: I0320 16:05:08.082999 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:08 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:08 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:08 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:08 crc kubenswrapper[4675]: I0320 16:05:08.083358 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:09 crc kubenswrapper[4675]: I0320 16:05:09.084409 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:09 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:09 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:09 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:09 crc kubenswrapper[4675]: I0320 16:05:09.085054 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:10 crc kubenswrapper[4675]: I0320 16:05:10.082191 4675 patch_prober.go:28] interesting pod/router-default-5444994796-q9jhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:05:10 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Mar 20 16:05:10 crc kubenswrapper[4675]: [+]process-running ok Mar 20 16:05:10 crc kubenswrapper[4675]: healthz check failed Mar 20 16:05:10 crc kubenswrapper[4675]: I0320 
16:05:10.082246 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q9jhd" podUID="ba4d60b3-764c-4378-ba52-23f712ab9eb0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:05:11 crc kubenswrapper[4675]: I0320 16:05:11.083024 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:05:11 crc kubenswrapper[4675]: I0320 16:05:11.085754 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-q9jhd" Mar 20 16:05:13 crc kubenswrapper[4675]: I0320 16:05:13.493463 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ccrx9" Mar 20 16:05:14 crc kubenswrapper[4675]: I0320 16:05:14.424973 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:05:14 crc kubenswrapper[4675]: I0320 16:05:14.430243 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:05:17 crc kubenswrapper[4675]: I0320 16:05:17.102544 4675 ???:1] "http: TLS handshake error from 192.168.126.11:57492: no serving certificate available for the kubelet" Mar 20 16:05:18 crc kubenswrapper[4675]: I0320 16:05:18.008636 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:05:19 crc kubenswrapper[4675]: I0320 16:05:19.892034 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-998d9bd74-jskv5"] Mar 20 16:05:19 crc kubenswrapper[4675]: I0320 16:05:19.892305 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" 
podUID="c980850b-711f-4780-842b-5e616b38b702" containerName="controller-manager" containerID="cri-o://b458384a77831deefbbd403d6f60df85369516681583c94758bc064669a5b894" gracePeriod=30 Mar 20 16:05:19 crc kubenswrapper[4675]: I0320 16:05:19.903413 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq"] Mar 20 16:05:19 crc kubenswrapper[4675]: I0320 16:05:19.903601 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" podUID="bdbc9640-d445-44d0-a797-0159a8d318a7" containerName="route-controller-manager" containerID="cri-o://f9849d2ef068b97ee6c3e52811ecad8ad68dbd5e6cb6da523c394f80c0ae25b3" gracePeriod=30 Mar 20 16:05:20 crc kubenswrapper[4675]: I0320 16:05:20.112305 4675 generic.go:334] "Generic (PLEG): container finished" podID="c980850b-711f-4780-842b-5e616b38b702" containerID="b458384a77831deefbbd403d6f60df85369516681583c94758bc064669a5b894" exitCode=0 Mar 20 16:05:20 crc kubenswrapper[4675]: I0320 16:05:20.112403 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" event={"ID":"c980850b-711f-4780-842b-5e616b38b702","Type":"ContainerDied","Data":"b458384a77831deefbbd403d6f60df85369516681583c94758bc064669a5b894"} Mar 20 16:05:20 crc kubenswrapper[4675]: I0320 16:05:20.114580 4675 generic.go:334] "Generic (PLEG): container finished" podID="bdbc9640-d445-44d0-a797-0159a8d318a7" containerID="f9849d2ef068b97ee6c3e52811ecad8ad68dbd5e6cb6da523c394f80c0ae25b3" exitCode=0 Mar 20 16:05:20 crc kubenswrapper[4675]: I0320 16:05:20.114640 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" event={"ID":"bdbc9640-d445-44d0-a797-0159a8d318a7","Type":"ContainerDied","Data":"f9849d2ef068b97ee6c3e52811ecad8ad68dbd5e6cb6da523c394f80c0ae25b3"} Mar 
20 16:05:22 crc kubenswrapper[4675]: I0320 16:05:22.298352 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" Mar 20 16:05:22 crc kubenswrapper[4675]: I0320 16:05:22.426683 4675 patch_prober.go:28] interesting pod/controller-manager-998d9bd74-jskv5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 16:05:22 crc kubenswrapper[4675]: I0320 16:05:22.426809 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" podUID="c980850b-711f-4780-842b-5e616b38b702" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 16:05:22 crc kubenswrapper[4675]: E0320 16:05:22.636389 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 16:05:22 crc kubenswrapper[4675]: E0320 16:05:22.636542 4675 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:05:22 crc kubenswrapper[4675]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 16:05:22 crc kubenswrapper[4675]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xv82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567044-wld22_openshift-infra(c6d2332a-bd88-45d7-8645-63778001dd65): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 16:05:22 crc kubenswrapper[4675]: > logger="UnhandledError" Mar 20 16:05:22 crc kubenswrapper[4675]: E0320 16:05:22.639025 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567044-wld22" podUID="c6d2332a-bd88-45d7-8645-63778001dd65" Mar 20 16:05:23 crc kubenswrapper[4675]: E0320 16:05:23.136066 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567044-wld22" podUID="c6d2332a-bd88-45d7-8645-63778001dd65" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.215703 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.218568 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.234244 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfd7e79e-d566-4cfc-80b0-b8ff3a489837-metrics-certs\") pod \"network-metrics-daemon-mrjmp\" (UID: \"dfd7e79e-d566-4cfc-80b0-b8ff3a489837\") " pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.528988 4675 patch_prober.go:28] interesting pod/route-controller-manager-5c4d55c496-vhrgq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.529052 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" podUID="bdbc9640-d445-44d0-a797-0159a8d318a7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.535965 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.544691 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mrjmp" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.832575 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.838872 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863190 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q"] Mar 20 16:05:25 crc kubenswrapper[4675]: E0320 16:05:25.863436 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c980850b-711f-4780-842b-5e616b38b702" containerName="controller-manager" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863455 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c980850b-711f-4780-842b-5e616b38b702" containerName="controller-manager" Mar 20 16:05:25 crc kubenswrapper[4675]: E0320 16:05:25.863467 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbc9640-d445-44d0-a797-0159a8d318a7" containerName="route-controller-manager" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863475 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbc9640-d445-44d0-a797-0159a8d318a7" containerName="route-controller-manager" Mar 20 16:05:25 crc kubenswrapper[4675]: E0320 16:05:25.863483 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca3b7f0-d36c-487e-938c-da2d8781061a" containerName="collect-profiles" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863491 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca3b7f0-d36c-487e-938c-da2d8781061a" containerName="collect-profiles" Mar 20 16:05:25 crc kubenswrapper[4675]: E0320 16:05:25.863503 
4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9928977-d919-4ae9-b9ed-834d0f42043d" containerName="pruner" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863510 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9928977-d919-4ae9-b9ed-834d0f42043d" containerName="pruner" Mar 20 16:05:25 crc kubenswrapper[4675]: E0320 16:05:25.863521 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4" containerName="pruner" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863528 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4" containerName="pruner" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863648 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c980850b-711f-4780-842b-5e616b38b702" containerName="controller-manager" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863664 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca3b7f0-d36c-487e-938c-da2d8781061a" containerName="collect-profiles" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863674 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9928977-d919-4ae9-b9ed-834d0f42043d" containerName="pruner" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863682 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdbc9640-d445-44d0-a797-0159a8d318a7" containerName="route-controller-manager" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.863694 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af38b7a-17a6-4cc6-bda9-70fd3f4de6b4" containerName="pruner" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.864123 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.918198 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q"] Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.924118 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c980850b-711f-4780-842b-5e616b38b702-serving-cert\") pod \"c980850b-711f-4780-842b-5e616b38b702\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.924221 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58fb26f6-10aa-4518-83fc-a502cf0f1f98-serving-cert\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.924252 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-config\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.924277 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-client-ca\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 
16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.924304 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qgk\" (UniqueName: \"kubernetes.io/projected/58fb26f6-10aa-4518-83fc-a502cf0f1f98-kube-api-access-h9qgk\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:25 crc kubenswrapper[4675]: I0320 16:05:25.934586 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c980850b-711f-4780-842b-5e616b38b702-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c980850b-711f-4780-842b-5e616b38b702" (UID: "c980850b-711f-4780-842b-5e616b38b702"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.024928 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4mmx\" (UniqueName: \"kubernetes.io/projected/c980850b-711f-4780-842b-5e616b38b702-kube-api-access-c4mmx\") pod \"c980850b-711f-4780-842b-5e616b38b702\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025026 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbc9640-d445-44d0-a797-0159a8d318a7-serving-cert\") pod \"bdbc9640-d445-44d0-a797-0159a8d318a7\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025088 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-proxy-ca-bundles\") pod \"c980850b-711f-4780-842b-5e616b38b702\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " Mar 20 16:05:26 crc 
kubenswrapper[4675]: I0320 16:05:26.025151 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-config\") pod \"c980850b-711f-4780-842b-5e616b38b702\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025191 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqv7x\" (UniqueName: \"kubernetes.io/projected/bdbc9640-d445-44d0-a797-0159a8d318a7-kube-api-access-fqv7x\") pod \"bdbc9640-d445-44d0-a797-0159a8d318a7\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025221 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-client-ca\") pod \"bdbc9640-d445-44d0-a797-0159a8d318a7\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025292 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-config\") pod \"bdbc9640-d445-44d0-a797-0159a8d318a7\" (UID: \"bdbc9640-d445-44d0-a797-0159a8d318a7\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025315 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-client-ca\") pod \"c980850b-711f-4780-842b-5e616b38b702\" (UID: \"c980850b-711f-4780-842b-5e616b38b702\") " Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025524 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58fb26f6-10aa-4518-83fc-a502cf0f1f98-serving-cert\") pod 
\"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025566 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-config\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025601 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-client-ca\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025644 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9qgk\" (UniqueName: \"kubernetes.io/projected/58fb26f6-10aa-4518-83fc-a502cf0f1f98-kube-api-access-h9qgk\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.025700 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c980850b-711f-4780-842b-5e616b38b702-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.027986 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "c980850b-711f-4780-842b-5e616b38b702" (UID: "c980850b-711f-4780-842b-5e616b38b702"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.028100 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "bdbc9640-d445-44d0-a797-0159a8d318a7" (UID: "bdbc9640-d445-44d0-a797-0159a8d318a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.028162 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-config" (OuterVolumeSpecName: "config") pod "bdbc9640-d445-44d0-a797-0159a8d318a7" (UID: "bdbc9640-d445-44d0-a797-0159a8d318a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.028554 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-client-ca" (OuterVolumeSpecName: "client-ca") pod "c980850b-711f-4780-842b-5e616b38b702" (UID: "c980850b-711f-4780-842b-5e616b38b702"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.028593 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-config" (OuterVolumeSpecName: "config") pod "c980850b-711f-4780-842b-5e616b38b702" (UID: "c980850b-711f-4780-842b-5e616b38b702"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.029173 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-client-ca\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.029453 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdbc9640-d445-44d0-a797-0159a8d318a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bdbc9640-d445-44d0-a797-0159a8d318a7" (UID: "bdbc9640-d445-44d0-a797-0159a8d318a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.030154 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-config\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.030638 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbc9640-d445-44d0-a797-0159a8d318a7-kube-api-access-fqv7x" (OuterVolumeSpecName: "kube-api-access-fqv7x") pod "bdbc9640-d445-44d0-a797-0159a8d318a7" (UID: "bdbc9640-d445-44d0-a797-0159a8d318a7"). InnerVolumeSpecName "kube-api-access-fqv7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.031014 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58fb26f6-10aa-4518-83fc-a502cf0f1f98-serving-cert\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.031303 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c980850b-711f-4780-842b-5e616b38b702-kube-api-access-c4mmx" (OuterVolumeSpecName: "kube-api-access-c4mmx") pod "c980850b-711f-4780-842b-5e616b38b702" (UID: "c980850b-711f-4780-842b-5e616b38b702"). InnerVolumeSpecName "kube-api-access-c4mmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.048927 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qgk\" (UniqueName: \"kubernetes.io/projected/58fb26f6-10aa-4518-83fc-a502cf0f1f98-kube-api-access-h9qgk\") pod \"route-controller-manager-5cc498856-bbq6q\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126539 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126582 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126593 4675 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-c4mmx\" (UniqueName: \"kubernetes.io/projected/c980850b-711f-4780-842b-5e616b38b702-kube-api-access-c4mmx\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126603 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbc9640-d445-44d0-a797-0159a8d318a7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126612 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126620 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980850b-711f-4780-842b-5e616b38b702-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126629 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqv7x\" (UniqueName: \"kubernetes.io/projected/bdbc9640-d445-44d0-a797-0159a8d318a7-kube-api-access-fqv7x\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.126637 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdbc9640-d445-44d0-a797-0159a8d318a7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.161985 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.162168 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq" event={"ID":"bdbc9640-d445-44d0-a797-0159a8d318a7","Type":"ContainerDied","Data":"c034fae2f5c99fbc42b24eb6d1781116b06bc06b157c8677db2c5ea3001badc4"} Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.162276 4675 scope.go:117] "RemoveContainer" containerID="f9849d2ef068b97ee6c3e52811ecad8ad68dbd5e6cb6da523c394f80c0ae25b3" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.167277 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" event={"ID":"c980850b-711f-4780-842b-5e616b38b702","Type":"ContainerDied","Data":"8037478e7a857a090166834f635445f11bcafec671a7f51ed1bdad31856924e8"} Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.167411 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-998d9bd74-jskv5" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.195972 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq"] Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.202084 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4d55c496-vhrgq"] Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.208224 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-998d9bd74-jskv5"] Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.210178 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.211189 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-998d9bd74-jskv5"] Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.681560 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbc9640-d445-44d0-a797-0159a8d318a7" path="/var/lib/kubelet/pods/bdbc9640-d445-44d0-a797-0159a8d318a7/volumes" Mar 20 16:05:26 crc kubenswrapper[4675]: I0320 16:05:26.682414 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c980850b-711f-4780-842b-5e616b38b702" path="/var/lib/kubelet/pods/c980850b-711f-4780-842b-5e616b38b702/volumes" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.095049 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5958c5996b-lx9q2"] Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.096472 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.102005 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.104093 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.104113 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.104261 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.104379 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.104756 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.108526 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5958c5996b-lx9q2"] Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.108983 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.152905 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-serving-cert\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " 
pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.152970 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlv4v\" (UniqueName: \"kubernetes.io/projected/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-kube-api-access-tlv4v\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.152997 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-client-ca\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.153152 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-proxy-ca-bundles\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.153309 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-config\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.253954 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-config\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.254008 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-serving-cert\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.254031 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlv4v\" (UniqueName: \"kubernetes.io/projected/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-kube-api-access-tlv4v\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.254050 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-client-ca\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.254091 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-proxy-ca-bundles\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.255298 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-client-ca\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.255400 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-config\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.256015 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-proxy-ca-bundles\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.261676 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-serving-cert\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.271960 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlv4v\" (UniqueName: \"kubernetes.io/projected/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-kube-api-access-tlv4v\") pod \"controller-manager-5958c5996b-lx9q2\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 
16:05:28 crc kubenswrapper[4675]: I0320 16:05:28.420270 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:28 crc kubenswrapper[4675]: E0320 16:05:28.925185 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 16:05:28 crc kubenswrapper[4675]: E0320 16:05:28.925367 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sw9n9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]Env
FromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b8vm6_openshift-marketplace(abca8440-77fa-48b9-a977-9bba2e267728): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:28 crc kubenswrapper[4675]: E0320 16:05:28.926634 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b8vm6" podUID="abca8440-77fa-48b9-a977-9bba2e267728" Mar 20 16:05:32 crc kubenswrapper[4675]: E0320 16:05:32.771850 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b8vm6" podUID="abca8440-77fa-48b9-a977-9bba2e267728" Mar 20 16:05:33 crc kubenswrapper[4675]: I0320 16:05:33.950624 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dd98v" Mar 20 16:05:34 crc kubenswrapper[4675]: I0320 16:05:34.424793 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:05:34 crc kubenswrapper[4675]: I0320 16:05:34.424851 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" 
podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.607510 4675 scope.go:117] "RemoveContainer" containerID="b458384a77831deefbbd403d6f60df85369516681583c94758bc064669a5b894" Mar 20 16:05:35 crc kubenswrapper[4675]: E0320 16:05:35.642718 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 16:05:35 crc kubenswrapper[4675]: E0320 16:05:35.642944 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8szdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k8lpk_openshift-marketplace(4d82201b-fc6f-4776-87f3-7cf89822bda5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:35 crc kubenswrapper[4675]: E0320 16:05:35.644213 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k8lpk" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" Mar 20 16:05:35 crc 
kubenswrapper[4675]: E0320 16:05:35.907594 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 16:05:35 crc kubenswrapper[4675]: E0320 16:05:35.907943 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vxpfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-4jq5j_openshift-marketplace(e85ec396-6d81-4ad2-b269-315df42e61c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.910168 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.911037 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:35 crc kubenswrapper[4675]: E0320 16:05:35.911551 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4jq5j" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.913015 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.915361 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.916222 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.958740 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 
16:05:35.958852 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:35 crc kubenswrapper[4675]: I0320 16:05:35.997657 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mrjmp"] Mar 20 16:05:36 crc kubenswrapper[4675]: W0320 16:05:36.010991 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd7e79e_d566_4cfc_80b0_b8ff3a489837.slice/crio-e7c51cad5209e5c947aedb9fc97ca91f83790d9dc0db5ccb05e86b94086800ee WatchSource:0}: Error finding container e7c51cad5209e5c947aedb9fc97ca91f83790d9dc0db5ccb05e86b94086800ee: Status 404 returned error can't find the container with id e7c51cad5209e5c947aedb9fc97ca91f83790d9dc0db5ccb05e86b94086800ee Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.061535 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.061589 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.061667 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.077904 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q"] Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.082083 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:36 crc kubenswrapper[4675]: W0320 16:05:36.086231 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58fb26f6_10aa_4518_83fc_a502cf0f1f98.slice/crio-99a61ea53a480abee1e759967bf4a96a8959eedf9530751e266c469583dcb264 WatchSource:0}: Error finding container 99a61ea53a480abee1e759967bf4a96a8959eedf9530751e266c469583dcb264: Status 404 returned error can't find the container with id 99a61ea53a480abee1e759967bf4a96a8959eedf9530751e266c469583dcb264 Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.183907 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5958c5996b-lx9q2"] Mar 20 16:05:36 crc kubenswrapper[4675]: W0320 16:05:36.214200 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289e0ec8_f0e1_42f2_bc63_0f970c1b188a.slice/crio-e718bde1d120b7aa5cd1650c3fe9ebe179419adaa31610827dcf31e642f66eb1 WatchSource:0}: Error finding container e718bde1d120b7aa5cd1650c3fe9ebe179419adaa31610827dcf31e642f66eb1: Status 404 returned error can't find the 
container with id e718bde1d120b7aa5cd1650c3fe9ebe179419adaa31610827dcf31e642f66eb1 Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.229964 4675 generic.go:334] "Generic (PLEG): container finished" podID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerID="16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273" exitCode=0 Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.230030 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7n9r9" event={"ID":"79c663f2-11ef-4c23-ac68-b8bb32997e77","Type":"ContainerDied","Data":"16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273"} Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.232365 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerStarted","Data":"35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748"} Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.238378 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" event={"ID":"289e0ec8-f0e1-42f2-bc63-0f970c1b188a","Type":"ContainerStarted","Data":"e718bde1d120b7aa5cd1650c3fe9ebe179419adaa31610827dcf31e642f66eb1"} Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.240461 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerStarted","Data":"598994fcfb2f44148b5dfc8623c92a129b6fb7af9df790147623469adc40325d"} Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.243302 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerStarted","Data":"3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457"} Mar 20 16:05:36 crc 
kubenswrapper[4675]: I0320 16:05:36.247741 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" event={"ID":"58fb26f6-10aa-4518-83fc-a502cf0f1f98","Type":"ContainerStarted","Data":"99a61ea53a480abee1e759967bf4a96a8959eedf9530751e266c469583dcb264"} Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.248150 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.263823 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerStarted","Data":"6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2"} Mar 20 16:05:36 crc kubenswrapper[4675]: I0320 16:05:36.271811 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" event={"ID":"dfd7e79e-d566-4cfc-80b0-b8ff3a489837","Type":"ContainerStarted","Data":"e7c51cad5209e5c947aedb9fc97ca91f83790d9dc0db5ccb05e86b94086800ee"} Mar 20 16:05:36 crc kubenswrapper[4675]: E0320 16:05:36.275830 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k8lpk" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" Mar 20 16:05:36 crc kubenswrapper[4675]: E0320 16:05:36.278303 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4jq5j" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" Mar 20 16:05:36 crc kubenswrapper[4675]: 
I0320 16:05:36.508022 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 16:05:36 crc kubenswrapper[4675]: W0320 16:05:36.515235 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod91e74f32_a9bd_4a15_9534_b9b9f31888a2.slice/crio-7a79387c2d40d6fbe79a7d8cab0802fdb31f52ac135bc91cb8aad27e2082d14d WatchSource:0}: Error finding container 7a79387c2d40d6fbe79a7d8cab0802fdb31f52ac135bc91cb8aad27e2082d14d: Status 404 returned error can't find the container with id 7a79387c2d40d6fbe79a7d8cab0802fdb31f52ac135bc91cb8aad27e2082d14d Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.277080 4675 generic.go:334] "Generic (PLEG): container finished" podID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerID="598994fcfb2f44148b5dfc8623c92a129b6fb7af9df790147623469adc40325d" exitCode=0 Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.277380 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerDied","Data":"598994fcfb2f44148b5dfc8623c92a129b6fb7af9df790147623469adc40325d"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.280372 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" event={"ID":"dfd7e79e-d566-4cfc-80b0-b8ff3a489837","Type":"ContainerStarted","Data":"e9d4ebbe5483edb330dce8731a314747002dce0e1afccddfe5938b30eed413ed"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.280406 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mrjmp" event={"ID":"dfd7e79e-d566-4cfc-80b0-b8ff3a489837","Type":"ContainerStarted","Data":"797c63bcf5cd47af32394e11e0aec477257ea81ccb2f3d50ae2966342f8ebc99"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.282052 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" event={"ID":"58fb26f6-10aa-4518-83fc-a502cf0f1f98","Type":"ContainerStarted","Data":"6bbfd1f7cbfa0b452929267ed196af6d88d2be21cdcf6347fae4831ba7992ba5"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.282232 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.283944 4675 generic.go:334] "Generic (PLEG): container finished" podID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerID="3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457" exitCode=0 Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.284012 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerDied","Data":"3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.285547 4675 generic.go:334] "Generic (PLEG): container finished" podID="ac02846f-d933-4f76-9085-19a28023c633" containerID="35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748" exitCode=0 Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.285597 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerDied","Data":"35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.286828 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"91e74f32-a9bd-4a15-9534-b9b9f31888a2","Type":"ContainerStarted","Data":"a730eb693d40aa4df94ec8ab603485ce5c4853a6a53fa0187ad2a3466ef5937c"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.286852 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"91e74f32-a9bd-4a15-9534-b9b9f31888a2","Type":"ContainerStarted","Data":"7a79387c2d40d6fbe79a7d8cab0802fdb31f52ac135bc91cb8aad27e2082d14d"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.288656 4675 generic.go:334] "Generic (PLEG): container finished" podID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerID="6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2" exitCode=0 Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.288726 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerDied","Data":"6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.290549 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" event={"ID":"289e0ec8-f0e1-42f2-bc63-0f970c1b188a","Type":"ContainerStarted","Data":"d7ab5b7ec34253ea2411a4a0a4f8474dec8b999e0dd5f9cbb8848085d6ead747"} Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.290754 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.294538 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.295525 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.310372 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
podStartSLOduration=2.31035425 podStartE2EDuration="2.31035425s" podCreationTimestamp="2026-03-20 16:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:37.30619216 +0000 UTC m=+257.339821697" watchObservedRunningTime="2026-03-20 16:05:37.31035425 +0000 UTC m=+257.343983797" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.330375 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mrjmp" podStartSLOduration=196.330356438 podStartE2EDuration="3m16.330356438s" podCreationTimestamp="2026-03-20 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:37.326860857 +0000 UTC m=+257.360490394" watchObservedRunningTime="2026-03-20 16:05:37.330356438 +0000 UTC m=+257.363985975" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.402476 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" podStartSLOduration=18.4024584 podStartE2EDuration="18.4024584s" podCreationTimestamp="2026-03-20 16:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:37.400249886 +0000 UTC m=+257.433879423" watchObservedRunningTime="2026-03-20 16:05:37.4024584 +0000 UTC m=+257.436087937" Mar 20 16:05:37 crc kubenswrapper[4675]: I0320 16:05:37.448843 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" podStartSLOduration=18.448827078 podStartE2EDuration="18.448827078s" podCreationTimestamp="2026-03-20 16:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 16:05:37.448173199 +0000 UTC m=+257.481802736" watchObservedRunningTime="2026-03-20 16:05:37.448827078 +0000 UTC m=+257.482456615" Mar 20 16:05:38 crc kubenswrapper[4675]: I0320 16:05:38.298923 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7n9r9" event={"ID":"79c663f2-11ef-4c23-ac68-b8bb32997e77","Type":"ContainerStarted","Data":"ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12"} Mar 20 16:05:38 crc kubenswrapper[4675]: I0320 16:05:38.300586 4675 generic.go:334] "Generic (PLEG): container finished" podID="91e74f32-a9bd-4a15-9534-b9b9f31888a2" containerID="a730eb693d40aa4df94ec8ab603485ce5c4853a6a53fa0187ad2a3466ef5937c" exitCode=0 Mar 20 16:05:38 crc kubenswrapper[4675]: I0320 16:05:38.300668 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"91e74f32-a9bd-4a15-9534-b9b9f31888a2","Type":"ContainerDied","Data":"a730eb693d40aa4df94ec8ab603485ce5c4853a6a53fa0187ad2a3466ef5937c"} Mar 20 16:05:38 crc kubenswrapper[4675]: I0320 16:05:38.325006 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7n9r9" podStartSLOduration=2.745099762 podStartE2EDuration="38.324977786s" podCreationTimestamp="2026-03-20 16:05:00 +0000 UTC" firstStartedPulling="2026-03-20 16:05:01.738109836 +0000 UTC m=+221.771739373" lastFinishedPulling="2026-03-20 16:05:37.31798787 +0000 UTC m=+257.351617397" observedRunningTime="2026-03-20 16:05:38.321060423 +0000 UTC m=+258.354689970" watchObservedRunningTime="2026-03-20 16:05:38.324977786 +0000 UTC m=+258.358607323" Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.546530 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.719149 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kubelet-dir\") pod \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.719289 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "91e74f32-a9bd-4a15-9534-b9b9f31888a2" (UID: "91e74f32-a9bd-4a15-9534-b9b9f31888a2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.719302 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kube-api-access\") pod \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\" (UID: \"91e74f32-a9bd-4a15-9534-b9b9f31888a2\") " Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.719632 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.725366 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "91e74f32-a9bd-4a15-9534-b9b9f31888a2" (UID: "91e74f32-a9bd-4a15-9534-b9b9f31888a2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.832101 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91e74f32-a9bd-4a15-9534-b9b9f31888a2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.881927 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5958c5996b-lx9q2"] Mar 20 16:05:39 crc kubenswrapper[4675]: I0320 16:05:39.977885 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q"] Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.312880 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" podUID="58fb26f6-10aa-4518-83fc-a502cf0f1f98" containerName="route-controller-manager" containerID="cri-o://6bbfd1f7cbfa0b452929267ed196af6d88d2be21cdcf6347fae4831ba7992ba5" gracePeriod=30 Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.313188 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.313865 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"91e74f32-a9bd-4a15-9534-b9b9f31888a2","Type":"ContainerDied","Data":"7a79387c2d40d6fbe79a7d8cab0802fdb31f52ac135bc91cb8aad27e2082d14d"} Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.313930 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a79387c2d40d6fbe79a7d8cab0802fdb31f52ac135bc91cb8aad27e2082d14d" Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.314983 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" podUID="289e0ec8-f0e1-42f2-bc63-0f970c1b188a" containerName="controller-manager" containerID="cri-o://d7ab5b7ec34253ea2411a4a0a4f8474dec8b999e0dd5f9cbb8848085d6ead747" gracePeriod=30 Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.948832 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:40 crc kubenswrapper[4675]: I0320 16:05:40.949123 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.331958 4675 generic.go:334] "Generic (PLEG): container finished" podID="58fb26f6-10aa-4518-83fc-a502cf0f1f98" containerID="6bbfd1f7cbfa0b452929267ed196af6d88d2be21cdcf6347fae4831ba7992ba5" exitCode=0 Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.332041 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" event={"ID":"58fb26f6-10aa-4518-83fc-a502cf0f1f98","Type":"ContainerDied","Data":"6bbfd1f7cbfa0b452929267ed196af6d88d2be21cdcf6347fae4831ba7992ba5"} Mar 20 
16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.344170 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerStarted","Data":"35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa"} Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.348584 4675 generic.go:334] "Generic (PLEG): container finished" podID="289e0ec8-f0e1-42f2-bc63-0f970c1b188a" containerID="d7ab5b7ec34253ea2411a4a0a4f8474dec8b999e0dd5f9cbb8848085d6ead747" exitCode=0 Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.348634 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" event={"ID":"289e0ec8-f0e1-42f2-bc63-0f970c1b188a","Type":"ContainerDied","Data":"d7ab5b7ec34253ea2411a4a0a4f8474dec8b999e0dd5f9cbb8848085d6ead747"} Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.372611 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwsgc" podStartSLOduration=2.417903796 podStartE2EDuration="41.372593256s" podCreationTimestamp="2026-03-20 16:05:00 +0000 UTC" firstStartedPulling="2026-03-20 16:05:01.83590133 +0000 UTC m=+221.869530857" lastFinishedPulling="2026-03-20 16:05:40.79059078 +0000 UTC m=+260.824220317" observedRunningTime="2026-03-20 16:05:41.368914369 +0000 UTC m=+261.402543906" watchObservedRunningTime="2026-03-20 16:05:41.372593256 +0000 UTC m=+261.406222793" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.480629 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 16:05:41 crc kubenswrapper[4675]: E0320 16:05:41.481239 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e74f32-a9bd-4a15-9534-b9b9f31888a2" containerName="pruner" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.481262 4675 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="91e74f32-a9bd-4a15-9534-b9b9f31888a2" containerName="pruner" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.481423 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e74f32-a9bd-4a15-9534-b9b9f31888a2" containerName="pruner" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.480719 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.481874 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.484826 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.485147 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.491089 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.515903 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"] Mar 20 16:05:41 crc kubenswrapper[4675]: E0320 16:05:41.516268 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58fb26f6-10aa-4518-83fc-a502cf0f1f98" containerName="route-controller-manager" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.516289 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fb26f6-10aa-4518-83fc-a502cf0f1f98" containerName="route-controller-manager" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.516406 4675 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58fb26f6-10aa-4518-83fc-a502cf0f1f98" containerName="route-controller-manager" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.516860 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.530412 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"] Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556196 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58fb26f6-10aa-4518-83fc-a502cf0f1f98-serving-cert\") pod \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556259 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9qgk\" (UniqueName: \"kubernetes.io/projected/58fb26f6-10aa-4518-83fc-a502cf0f1f98-kube-api-access-h9qgk\") pod \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556286 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-client-ca\") pod \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556310 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-config\") pod \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\" (UID: \"58fb26f6-10aa-4518-83fc-a502cf0f1f98\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556464 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-client-ca\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556487 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmrs\" (UniqueName: \"kubernetes.io/projected/bd92322a-c1ea-4e44-b7bc-150a93a29650-kube-api-access-4hmrs\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556518 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd92322a-c1ea-4e44-b7bc-150a93a29650-serving-cert\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556536 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556553 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-config\") pod 
\"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556569 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0fa5466-5ff3-4b74-a932-5ee34be11884-kube-api-access\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.556603 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-var-lock\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.559959 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-config" (OuterVolumeSpecName: "config") pod "58fb26f6-10aa-4518-83fc-a502cf0f1f98" (UID: "58fb26f6-10aa-4518-83fc-a502cf0f1f98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.560255 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-client-ca" (OuterVolumeSpecName: "client-ca") pod "58fb26f6-10aa-4518-83fc-a502cf0f1f98" (UID: "58fb26f6-10aa-4518-83fc-a502cf0f1f98"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.562442 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fb26f6-10aa-4518-83fc-a502cf0f1f98-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58fb26f6-10aa-4518-83fc-a502cf0f1f98" (UID: "58fb26f6-10aa-4518-83fc-a502cf0f1f98"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.567954 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fb26f6-10aa-4518-83fc-a502cf0f1f98-kube-api-access-h9qgk" (OuterVolumeSpecName: "kube-api-access-h9qgk") pod "58fb26f6-10aa-4518-83fc-a502cf0f1f98" (UID: "58fb26f6-10aa-4518-83fc-a502cf0f1f98"). InnerVolumeSpecName "kube-api-access-h9qgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.578280 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.657593 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlv4v\" (UniqueName: \"kubernetes.io/projected/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-kube-api-access-tlv4v\") pod \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.657738 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-proxy-ca-bundles\") pod \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.657814 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-client-ca\") pod \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.657864 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-serving-cert\") pod \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.657917 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-config\") pod \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\" (UID: \"289e0ec8-f0e1-42f2-bc63-0f970c1b188a\") " Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658455 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd92322a-c1ea-4e44-b7bc-150a93a29650-serving-cert\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658505 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658506 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "289e0ec8-f0e1-42f2-bc63-0f970c1b188a" (UID: "289e0ec8-f0e1-42f2-bc63-0f970c1b188a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658523 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-config\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658557 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-client-ca" (OuterVolumeSpecName: "client-ca") pod "289e0ec8-f0e1-42f2-bc63-0f970c1b188a" (UID: "289e0ec8-f0e1-42f2-bc63-0f970c1b188a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658616 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0fa5466-5ff3-4b74-a932-5ee34be11884-kube-api-access\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658646 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-config" (OuterVolumeSpecName: "config") pod "289e0ec8-f0e1-42f2-bc63-0f970c1b188a" (UID: "289e0ec8-f0e1-42f2-bc63-0f970c1b188a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658617 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658689 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-var-lock\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658820 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-client-ca\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658848 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmrs\" (UniqueName: \"kubernetes.io/projected/bd92322a-c1ea-4e44-b7bc-150a93a29650-kube-api-access-4hmrs\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658928 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.658998 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58fb26f6-10aa-4518-83fc-a502cf0f1f98-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659010 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659019 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9qgk\" (UniqueName: \"kubernetes.io/projected/58fb26f6-10aa-4518-83fc-a502cf0f1f98-kube-api-access-h9qgk\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659031 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659040 4675 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/58fb26f6-10aa-4518-83fc-a502cf0f1f98-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659047 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659165 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-var-lock\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.659831 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-client-ca\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.660633 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "289e0ec8-f0e1-42f2-bc63-0f970c1b188a" (UID: "289e0ec8-f0e1-42f2-bc63-0f970c1b188a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.661150 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-config\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.661941 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd92322a-c1ea-4e44-b7bc-150a93a29650-serving-cert\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.662886 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-kube-api-access-tlv4v" (OuterVolumeSpecName: "kube-api-access-tlv4v") pod "289e0ec8-f0e1-42f2-bc63-0f970c1b188a" (UID: "289e0ec8-f0e1-42f2-bc63-0f970c1b188a"). InnerVolumeSpecName "kube-api-access-tlv4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.675096 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0fa5466-5ff3-4b74-a932-5ee34be11884-kube-api-access\") pod \"installer-9-crc\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.675157 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmrs\" (UniqueName: \"kubernetes.io/projected/bd92322a-c1ea-4e44-b7bc-150a93a29650-kube-api-access-4hmrs\") pod \"route-controller-manager-7c9cf68cd7-vb2fx\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") " pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.760091 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.760130 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlv4v\" (UniqueName: \"kubernetes.io/projected/289e0ec8-f0e1-42f2-bc63-0f970c1b188a-kube-api-access-tlv4v\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.874165 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:41 crc kubenswrapper[4675]: I0320 16:05:41.895435 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.358276 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" event={"ID":"289e0ec8-f0e1-42f2-bc63-0f970c1b188a","Type":"ContainerDied","Data":"e718bde1d120b7aa5cd1650c3fe9ebe179419adaa31610827dcf31e642f66eb1"} Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.360056 4675 scope.go:117] "RemoveContainer" containerID="d7ab5b7ec34253ea2411a4a0a4f8474dec8b999e0dd5f9cbb8848085d6ead747" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.358335 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5958c5996b-lx9q2" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.360552 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerStarted","Data":"0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff"} Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.362909 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.363003 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q" event={"ID":"58fb26f6-10aa-4518-83fc-a502cf0f1f98","Type":"ContainerDied","Data":"99a61ea53a480abee1e759967bf4a96a8959eedf9530751e266c469583dcb264"} Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.384933 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmc4p" podStartSLOduration=3.923141357 podStartE2EDuration="42.384916085s" podCreationTimestamp="2026-03-20 16:05:00 +0000 UTC" firstStartedPulling="2026-03-20 16:05:02.914047201 +0000 UTC m=+222.947676738" lastFinishedPulling="2026-03-20 16:05:41.375821939 +0000 UTC m=+261.409451466" observedRunningTime="2026-03-20 16:05:42.38267056 +0000 UTC m=+262.416300097" watchObservedRunningTime="2026-03-20 16:05:42.384916085 +0000 UTC m=+262.418545622" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.397695 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5958c5996b-lx9q2"] Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.401563 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5958c5996b-lx9q2"] Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.409471 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q"] Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.412347 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cc498856-bbq6q"] Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.419840 4675 scope.go:117] "RemoveContainer" 
containerID="6bbfd1f7cbfa0b452929267ed196af6d88d2be21cdcf6347fae4831ba7992ba5" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.516376 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7n9r9" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="registry-server" probeResult="failure" output=< Mar 20 16:05:42 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s Mar 20 16:05:42 crc kubenswrapper[4675]: > Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.683293 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289e0ec8-f0e1-42f2-bc63-0f970c1b188a" path="/var/lib/kubelet/pods/289e0ec8-f0e1-42f2-bc63-0f970c1b188a/volumes" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.684111 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58fb26f6-10aa-4518-83fc-a502cf0f1f98" path="/var/lib/kubelet/pods/58fb26f6-10aa-4518-83fc-a502cf0f1f98/volumes" Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.977539 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 16:05:42 crc kubenswrapper[4675]: W0320 16:05:42.980391 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf0fa5466_5ff3_4b74_a932_5ee34be11884.slice/crio-88b25899424e93ad579502fdda25175805e2062c4114d74ad10181270dfe359c WatchSource:0}: Error finding container 88b25899424e93ad579502fdda25175805e2062c4114d74ad10181270dfe359c: Status 404 returned error can't find the container with id 88b25899424e93ad579502fdda25175805e2062c4114d74ad10181270dfe359c Mar 20 16:05:42 crc kubenswrapper[4675]: W0320 16:05:42.980935 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd92322a_c1ea_4e44_b7bc_150a93a29650.slice/crio-eeff404aaaf6adc13d6da19a482599a7b4e045fe37b107816d45a77d700dec6c WatchSource:0}: Error 
finding container eeff404aaaf6adc13d6da19a482599a7b4e045fe37b107816d45a77d700dec6c: Status 404 returned error can't find the container with id eeff404aaaf6adc13d6da19a482599a7b4e045fe37b107816d45a77d700dec6c Mar 20 16:05:42 crc kubenswrapper[4675]: I0320 16:05:42.981114 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"] Mar 20 16:05:43 crc kubenswrapper[4675]: I0320 16:05:43.369862 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerStarted","Data":"7197ed9a940fb897482b38318055910b7f47dac23e1ab67d3983e01c680f79bc"} Mar 20 16:05:43 crc kubenswrapper[4675]: I0320 16:05:43.371297 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" event={"ID":"bd92322a-c1ea-4e44-b7bc-150a93a29650","Type":"ContainerStarted","Data":"eeff404aaaf6adc13d6da19a482599a7b4e045fe37b107816d45a77d700dec6c"} Mar 20 16:05:43 crc kubenswrapper[4675]: I0320 16:05:43.379660 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f0fa5466-5ff3-4b74-a932-5ee34be11884","Type":"ContainerStarted","Data":"88b25899424e93ad579502fdda25175805e2062c4114d74ad10181270dfe359c"} Mar 20 16:05:43 crc kubenswrapper[4675]: I0320 16:05:43.400567 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l5s99" podStartSLOduration=3.996532251 podStartE2EDuration="40.400546661s" podCreationTimestamp="2026-03-20 16:05:03 +0000 UTC" firstStartedPulling="2026-03-20 16:05:06.015090492 +0000 UTC m=+226.048720039" lastFinishedPulling="2026-03-20 16:05:42.419104912 +0000 UTC m=+262.452734449" observedRunningTime="2026-03-20 16:05:43.398908844 +0000 UTC m=+263.432538401" watchObservedRunningTime="2026-03-20 
16:05:43.400546661 +0000 UTC m=+263.434176208" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.115422 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-575f7569d5-rcns4"] Mar 20 16:05:44 crc kubenswrapper[4675]: E0320 16:05:44.115707 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289e0ec8-f0e1-42f2-bc63-0f970c1b188a" containerName="controller-manager" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.115724 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="289e0ec8-f0e1-42f2-bc63-0f970c1b188a" containerName="controller-manager" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.115872 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="289e0ec8-f0e1-42f2-bc63-0f970c1b188a" containerName="controller-manager" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.116319 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.117829 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.118257 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.118644 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.118895 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.118983 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 
16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.123486 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.130747 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-575f7569d5-rcns4"] Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.156271 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.198469 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hg4l\" (UniqueName: \"kubernetes.io/projected/44da7f78-8444-48dd-ada1-ab8aa7fde155-kube-api-access-6hg4l\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.198521 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-client-ca\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.198538 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44da7f78-8444-48dd-ada1-ab8aa7fde155-serving-cert\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.198554 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-proxy-ca-bundles\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.198622 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-config\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.299962 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-config\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.300047 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hg4l\" (UniqueName: \"kubernetes.io/projected/44da7f78-8444-48dd-ada1-ab8aa7fde155-kube-api-access-6hg4l\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.300083 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-client-ca\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " 
pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.300111 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44da7f78-8444-48dd-ada1-ab8aa7fde155-serving-cert\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.300127 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-proxy-ca-bundles\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.301266 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-client-ca\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.301425 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-proxy-ca-bundles\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.303063 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-config\") pod 
\"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.314429 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44da7f78-8444-48dd-ada1-ab8aa7fde155-serving-cert\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.322797 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hg4l\" (UniqueName: \"kubernetes.io/projected/44da7f78-8444-48dd-ada1-ab8aa7fde155-kube-api-access-6hg4l\") pod \"controller-manager-575f7569d5-rcns4\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.389685 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerStarted","Data":"014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43"} Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.391649 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f0fa5466-5ff3-4b74-a932-5ee34be11884","Type":"ContainerStarted","Data":"db035d95a49d25cd35a989ac1d5f1cbfa8c875dc577a370a38ade2f28d556d66"} Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.392777 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" event={"ID":"bd92322a-c1ea-4e44-b7bc-150a93a29650","Type":"ContainerStarted","Data":"7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa"} Mar 20 
16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.437219 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.448943 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.448987 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l5s99" Mar 20 16:05:44 crc kubenswrapper[4675]: I0320 16:05:44.857294 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-575f7569d5-rcns4"] Mar 20 16:05:44 crc kubenswrapper[4675]: W0320 16:05:44.862018 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44da7f78_8444_48dd_ada1_ab8aa7fde155.slice/crio-9964d8f2fcaa33e55bb3004bc683efebe4e13bb1dba5c4f9f15daccc7978be50 WatchSource:0}: Error finding container 9964d8f2fcaa33e55bb3004bc683efebe4e13bb1dba5c4f9f15daccc7978be50: Status 404 returned error can't find the container with id 9964d8f2fcaa33e55bb3004bc683efebe4e13bb1dba5c4f9f15daccc7978be50 Mar 20 16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.402864 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" event={"ID":"44da7f78-8444-48dd-ada1-ab8aa7fde155","Type":"ContainerStarted","Data":"b1179d4a45fe7130fa1c1f76b94f1e11f45b9283be1755f2ba1d704adec5f921"} Mar 20 16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.402937 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" event={"ID":"44da7f78-8444-48dd-ada1-ab8aa7fde155","Type":"ContainerStarted","Data":"9964d8f2fcaa33e55bb3004bc683efebe4e13bb1dba5c4f9f15daccc7978be50"} Mar 20 
16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.434366 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.434345046 podStartE2EDuration="4.434345046s" podCreationTimestamp="2026-03-20 16:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:45.431750991 +0000 UTC m=+265.465380528" watchObservedRunningTime="2026-03-20 16:05:45.434345046 +0000 UTC m=+265.467974583" Mar 20 16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.462804 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" podStartSLOduration=6.462783817 podStartE2EDuration="6.462783817s" podCreationTimestamp="2026-03-20 16:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:45.459270866 +0000 UTC m=+265.492900403" watchObservedRunningTime="2026-03-20 16:05:45.462783817 +0000 UTC m=+265.496413354" Mar 20 16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.476893 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" podStartSLOduration=6.476877244 podStartE2EDuration="6.476877244s" podCreationTimestamp="2026-03-20 16:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:45.474092234 +0000 UTC m=+265.507721781" watchObservedRunningTime="2026-03-20 16:05:45.476877244 +0000 UTC m=+265.510506771" Mar 20 16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.491268 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l5s99" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" 
containerName="registry-server" probeResult="failure" output=< Mar 20 16:05:45 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s Mar 20 16:05:45 crc kubenswrapper[4675]: > Mar 20 16:05:45 crc kubenswrapper[4675]: I0320 16:05:45.495138 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-blmfl" podStartSLOduration=4.912264481 podStartE2EDuration="42.495117261s" podCreationTimestamp="2026-03-20 16:05:03 +0000 UTC" firstStartedPulling="2026-03-20 16:05:06.021658211 +0000 UTC m=+226.055287748" lastFinishedPulling="2026-03-20 16:05:43.604511001 +0000 UTC m=+263.638140528" observedRunningTime="2026-03-20 16:05:45.494034229 +0000 UTC m=+265.527663766" watchObservedRunningTime="2026-03-20 16:05:45.495117261 +0000 UTC m=+265.528746798" Mar 20 16:05:46 crc kubenswrapper[4675]: I0320 16:05:46.408816 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:46 crc kubenswrapper[4675]: I0320 16:05:46.413344 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.428750 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-wld22" event={"ID":"c6d2332a-bd88-45d7-8645-63778001dd65","Type":"ContainerStarted","Data":"99e087d316e575725ebee1ad8e9129cd24309a60c1b113f3efe910b1e8259618"} Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.700731 4675 csr.go:261] certificate signing request csr-xjcpf is approved, waiting to be issued Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.711343 4675 csr.go:257] certificate signing request csr-xjcpf is issued Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.730669 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.730729 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.833020 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:50 crc kubenswrapper[4675]: I0320 16:05:50.998499 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7n9r9"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.043518 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7n9r9"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.200438 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmc4p"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.200753 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cmc4p"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.259918 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmc4p"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.435561 4675 generic.go:334] "Generic (PLEG): container finished" podID="c6d2332a-bd88-45d7-8645-63778001dd65" containerID="99e087d316e575725ebee1ad8e9129cd24309a60c1b113f3efe910b1e8259618" exitCode=0
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.435633 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-wld22" event={"ID":"c6d2332a-bd88-45d7-8645-63778001dd65","Type":"ContainerDied","Data":"99e087d316e575725ebee1ad8e9129cd24309a60c1b113f3efe910b1e8259618"}
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.437625 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerStarted","Data":"767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6"}
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.442106 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerStarted","Data":"6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72"}
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.490232 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmc4p"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.491072 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.713133 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 13:35:45.683314535 +0000 UTC
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.713170 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6429h29m53.97014641s for next certificate rotation
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.896389 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"
Mar 20 16:05:51 crc kubenswrapper[4675]: I0320 16:05:51.901642 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.449748 4675 generic.go:334] "Generic (PLEG): container finished" podID="abca8440-77fa-48b9-a977-9bba2e267728" containerID="6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72" exitCode=0
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.449877 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerDied","Data":"6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72"}
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.456750 4675 generic.go:334] "Generic (PLEG): container finished" podID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerID="dd0f23270fbde5306b4ca038a84799ef9d984cc6d4a159c3bd99e8ac86cb2551" exitCode=0
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.456934 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jq5j" event={"ID":"e85ec396-6d81-4ad2-b269-315df42e61c4","Type":"ContainerDied","Data":"dd0f23270fbde5306b4ca038a84799ef9d984cc6d4a159c3bd99e8ac86cb2551"}
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.460221 4675 generic.go:334] "Generic (PLEG): container finished" podID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerID="767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6" exitCode=0
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.460441 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerDied","Data":"767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6"}
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.714016 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 22:45:06.747533866 +0000 UTC
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.714067 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6126h39m14.033468961s for next certificate rotation
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.810799 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-wld22"
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.850873 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/c6d2332a-bd88-45d7-8645-63778001dd65-kube-api-access-2xv82\") pod \"c6d2332a-bd88-45d7-8645-63778001dd65\" (UID: \"c6d2332a-bd88-45d7-8645-63778001dd65\") "
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.858666 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d2332a-bd88-45d7-8645-63778001dd65-kube-api-access-2xv82" (OuterVolumeSpecName: "kube-api-access-2xv82") pod "c6d2332a-bd88-45d7-8645-63778001dd65" (UID: "c6d2332a-bd88-45d7-8645-63778001dd65"). InnerVolumeSpecName "kube-api-access-2xv82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:52 crc kubenswrapper[4675]: I0320 16:05:52.958042 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/c6d2332a-bd88-45d7-8645-63778001dd65-kube-api-access-2xv82\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.470206 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-wld22" event={"ID":"c6d2332a-bd88-45d7-8645-63778001dd65","Type":"ContainerDied","Data":"8f7c0f92776fc5e06ed3dec3038d8b84e8d9f4731a9777197d1cfb1195fbdaa3"}
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.470254 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f7c0f92776fc5e06ed3dec3038d8b84e8d9f4731a9777197d1cfb1195fbdaa3"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.470537 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-wld22"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.472433 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jq5j" event={"ID":"e85ec396-6d81-4ad2-b269-315df42e61c4","Type":"ContainerStarted","Data":"95583a7526ead4cb034dda4c12c6f509a8256fd3e651b2a9dd628b9c60dcc619"}
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.475837 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerStarted","Data":"b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a"}
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.479432 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerStarted","Data":"43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29"}
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.511655 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4jq5j" podStartSLOduration=3.645980995 podStartE2EDuration="51.511636505s" podCreationTimestamp="2026-03-20 16:05:02 +0000 UTC" firstStartedPulling="2026-03-20 16:05:05.049019198 +0000 UTC m=+225.082648735" lastFinishedPulling="2026-03-20 16:05:52.914674688 +0000 UTC m=+272.948304245" observedRunningTime="2026-03-20 16:05:53.494550312 +0000 UTC m=+273.528179849" watchObservedRunningTime="2026-03-20 16:05:53.511636505 +0000 UTC m=+273.545266042"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.512339 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmc4p"]
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.527986 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b8vm6" podStartSLOduration=3.549310683 podStartE2EDuration="53.527967806s" podCreationTimestamp="2026-03-20 16:05:00 +0000 UTC" firstStartedPulling="2026-03-20 16:05:02.888580705 +0000 UTC m=+222.922210242" lastFinishedPulling="2026-03-20 16:05:52.867237828 +0000 UTC m=+272.900867365" observedRunningTime="2026-03-20 16:05:53.525728272 +0000 UTC m=+273.559357819" watchObservedRunningTime="2026-03-20 16:05:53.527967806 +0000 UTC m=+273.561597343"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.840745 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k8lpk" podStartSLOduration=2.907591833 podStartE2EDuration="51.840724077s" podCreationTimestamp="2026-03-20 16:05:02 +0000 UTC" firstStartedPulling="2026-03-20 16:05:03.943104434 +0000 UTC m=+223.976733971" lastFinishedPulling="2026-03-20 16:05:52.876236678 +0000 UTC m=+272.909866215" observedRunningTime="2026-03-20 16:05:53.544238146 +0000 UTC m=+273.577867683" watchObservedRunningTime="2026-03-20 16:05:53.840724077 +0000 UTC m=+273.874353614"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.943536 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-blmfl"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.943581 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-blmfl"
Mar 20 16:05:53 crc kubenswrapper[4675]: I0320 16:05:53.986268 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-blmfl"
Mar 20 16:05:54 crc kubenswrapper[4675]: I0320 16:05:54.484888 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmc4p" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="registry-server" containerID="cri-o://0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff" gracePeriod=2
Mar 20 16:05:54 crc kubenswrapper[4675]: I0320 16:05:54.492942 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l5s99"
Mar 20 16:05:54 crc kubenswrapper[4675]: I0320 16:05:54.523533 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-blmfl"
Mar 20 16:05:54 crc kubenswrapper[4675]: I0320 16:05:54.543549 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l5s99"
Mar 20 16:05:54 crc kubenswrapper[4675]: I0320 16:05:54.991977 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmc4p"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.081075 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-utilities\") pod \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") "
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.081157 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqhln\" (UniqueName: \"kubernetes.io/projected/acb87e24-d219-4f7d-b28b-689cb6ccaa56-kube-api-access-nqhln\") pod \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") "
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.081255 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-catalog-content\") pod \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\" (UID: \"acb87e24-d219-4f7d-b28b-689cb6ccaa56\") "
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.082445 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-utilities" (OuterVolumeSpecName: "utilities") pod "acb87e24-d219-4f7d-b28b-689cb6ccaa56" (UID: "acb87e24-d219-4f7d-b28b-689cb6ccaa56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.096091 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb87e24-d219-4f7d-b28b-689cb6ccaa56-kube-api-access-nqhln" (OuterVolumeSpecName: "kube-api-access-nqhln") pod "acb87e24-d219-4f7d-b28b-689cb6ccaa56" (UID: "acb87e24-d219-4f7d-b28b-689cb6ccaa56"). InnerVolumeSpecName "kube-api-access-nqhln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.143420 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acb87e24-d219-4f7d-b28b-689cb6ccaa56" (UID: "acb87e24-d219-4f7d-b28b-689cb6ccaa56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.182306 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.182579 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb87e24-d219-4f7d-b28b-689cb6ccaa56-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.182644 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqhln\" (UniqueName: \"kubernetes.io/projected/acb87e24-d219-4f7d-b28b-689cb6ccaa56-kube-api-access-nqhln\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.493505 4675 generic.go:334] "Generic (PLEG): container finished" podID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerID="0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff" exitCode=0
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.493541 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerDied","Data":"0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff"}
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.493632 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmc4p"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.493972 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmc4p" event={"ID":"acb87e24-d219-4f7d-b28b-689cb6ccaa56","Type":"ContainerDied","Data":"f267d036ee9d2c8b29228395201d091413f3be28a298af2e39692749b44b4937"}
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.494021 4675 scope.go:117] "RemoveContainer" containerID="0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.520463 4675 scope.go:117] "RemoveContainer" containerID="3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.531582 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmc4p"]
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.535500 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmc4p"]
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.559676 4675 scope.go:117] "RemoveContainer" containerID="543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.583497 4675 scope.go:117] "RemoveContainer" containerID="0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff"
Mar 20 16:05:55 crc kubenswrapper[4675]: E0320 16:05:55.584087 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff\": container with ID starting with 0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff not found: ID does not exist" containerID="0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.584146 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff"} err="failed to get container status \"0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff\": rpc error: code = NotFound desc = could not find container \"0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff\": container with ID starting with 0508bd93aefc43f5167537cc7e2fadde8883641f15f39e8a8254c765d8ba98ff not found: ID does not exist"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.584192 4675 scope.go:117] "RemoveContainer" containerID="3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457"
Mar 20 16:05:55 crc kubenswrapper[4675]: E0320 16:05:55.584736 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457\": container with ID starting with 3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457 not found: ID does not exist" containerID="3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.584858 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457"} err="failed to get container status \"3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457\": rpc error: code = NotFound desc = could not find container \"3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457\": container with ID starting with 3b5feb96d88b808b306f78d9840c417e5a3f4e2609d10325e0766f9d6c4e7457 not found: ID does not exist"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.584924 4675 scope.go:117] "RemoveContainer" containerID="543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85"
Mar 20 16:05:55 crc kubenswrapper[4675]: E0320 16:05:55.586265 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85\": container with ID starting with 543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85 not found: ID does not exist" containerID="543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85"
Mar 20 16:05:55 crc kubenswrapper[4675]: I0320 16:05:55.586302 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85"} err="failed to get container status \"543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85\": rpc error: code = NotFound desc = could not find container \"543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85\": container with ID starting with 543e396b116db7e424cebe159ab60b032eada66cd70a73bfefc0f0d1027cea85 not found: ID does not exist"
Mar 20 16:05:56 crc kubenswrapper[4675]: I0320 16:05:56.680568 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" path="/var/lib/kubelet/pods/acb87e24-d219-4f7d-b28b-689cb6ccaa56/volumes"
Mar 20 16:05:57 crc kubenswrapper[4675]: I0320 16:05:57.916329 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5s99"]
Mar 20 16:05:57 crc kubenswrapper[4675]: I0320 16:05:57.917141 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l5s99" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="registry-server" containerID="cri-o://7197ed9a940fb897482b38318055910b7f47dac23e1ab67d3983e01c680f79bc" gracePeriod=2
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.520405 4675 generic.go:334] "Generic (PLEG): container finished" podID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerID="7197ed9a940fb897482b38318055910b7f47dac23e1ab67d3983e01c680f79bc" exitCode=0
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.520838 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerDied","Data":"7197ed9a940fb897482b38318055910b7f47dac23e1ab67d3983e01c680f79bc"}
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.903789 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5s99"
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.943415 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-utilities\") pod \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") "
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.943546 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-catalog-content\") pod \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") "
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.943624 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q88p6\" (UniqueName: \"kubernetes.io/projected/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-kube-api-access-q88p6\") pod \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\" (UID: \"fdfeb04b-a650-4820-86ad-84e4cdd56e3b\") "
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.944618 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-utilities" (OuterVolumeSpecName: "utilities") pod "fdfeb04b-a650-4820-86ad-84e4cdd56e3b" (UID: "fdfeb04b-a650-4820-86ad-84e4cdd56e3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:05:58 crc kubenswrapper[4675]: I0320 16:05:58.949867 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-kube-api-access-q88p6" (OuterVolumeSpecName: "kube-api-access-q88p6") pod "fdfeb04b-a650-4820-86ad-84e4cdd56e3b" (UID: "fdfeb04b-a650-4820-86ad-84e4cdd56e3b"). InnerVolumeSpecName "kube-api-access-q88p6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.045225 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q88p6\" (UniqueName: \"kubernetes.io/projected/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-kube-api-access-q88p6\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.045267 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.084151 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdfeb04b-a650-4820-86ad-84e4cdd56e3b" (UID: "fdfeb04b-a650-4820-86ad-84e4cdd56e3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.146987 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdfeb04b-a650-4820-86ad-84e4cdd56e3b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.529560 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5s99" event={"ID":"fdfeb04b-a650-4820-86ad-84e4cdd56e3b","Type":"ContainerDied","Data":"092e6bdf0b4c1a6d485a0bb192d6667c6344e7e72ccada2b3181b94fc3e242ef"}
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.529632 4675 scope.go:117] "RemoveContainer" containerID="7197ed9a940fb897482b38318055910b7f47dac23e1ab67d3983e01c680f79bc"
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.529668 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5s99"
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.547479 4675 scope.go:117] "RemoveContainer" containerID="598994fcfb2f44148b5dfc8623c92a129b6fb7af9df790147623469adc40325d"
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.565401 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5s99"]
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.572119 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l5s99"]
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.588010 4675 scope.go:117] "RemoveContainer" containerID="57c749916a253ec6fa2fe5604f71cefa04578b75c4e4028cfa80a5453b5b4a2f"
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.905692 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-575f7569d5-rcns4"]
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.905940 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" podUID="44da7f78-8444-48dd-ada1-ab8aa7fde155" containerName="controller-manager" containerID="cri-o://b1179d4a45fe7130fa1c1f76b94f1e11f45b9283be1755f2ba1d704adec5f921" gracePeriod=30
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.924644 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"]
Mar 20 16:05:59 crc kubenswrapper[4675]: I0320 16:05:59.924989 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" podUID="bd92322a-c1ea-4e44-b7bc-150a93a29650" containerName="route-controller-manager" containerID="cri-o://7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa" gracePeriod=30
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130641 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567046-mvnw2"]
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.130890 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d2332a-bd88-45d7-8645-63778001dd65" containerName="oc"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130903 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d2332a-bd88-45d7-8645-63778001dd65" containerName="oc"
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.130913 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="registry-server"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130919 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="registry-server"
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.130929 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="extract-utilities"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130935 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="extract-utilities"
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.130943 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="extract-content"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130949 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="extract-content"
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.130961 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="extract-content"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130967 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="extract-content"
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.130976 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="registry-server"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.130982 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="registry-server"
Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.131005 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="extract-utilities"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.131013 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="extract-utilities"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.131121 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb87e24-d219-4f7d-b28b-689cb6ccaa56" containerName="registry-server"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.131131 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" containerName="registry-server"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.131140 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d2332a-bd88-45d7-8645-63778001dd65" containerName="oc"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.131574 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.134787 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.135017 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.138458 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-mvnw2"]
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.141639 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.162722 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9948\" (UniqueName: \"kubernetes.io/projected/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6-kube-api-access-f9948\") pod \"auto-csr-approver-29567046-mvnw2\" (UID: \"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6\") " pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.268338 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9948\" (UniqueName: \"kubernetes.io/projected/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6-kube-api-access-f9948\") pod \"auto-csr-approver-29567046-mvnw2\" (UID: \"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6\") " pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.305863 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9948\" (UniqueName: \"kubernetes.io/projected/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6-kube-api-access-f9948\") pod \"auto-csr-approver-29567046-mvnw2\" (UID: \"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6\") " pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.420877 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.449561 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.487336 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd92322a-c1ea-4e44-b7bc-150a93a29650-serving-cert\") pod \"bd92322a-c1ea-4e44-b7bc-150a93a29650\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") "
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.487416 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-client-ca\") pod \"bd92322a-c1ea-4e44-b7bc-150a93a29650\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") "
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.487465 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-config\") pod \"bd92322a-c1ea-4e44-b7bc-150a93a29650\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") "
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.487520 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmrs\" (UniqueName: \"kubernetes.io/projected/bd92322a-c1ea-4e44-b7bc-150a93a29650-kube-api-access-4hmrs\") pod \"bd92322a-c1ea-4e44-b7bc-150a93a29650\" (UID: \"bd92322a-c1ea-4e44-b7bc-150a93a29650\") "
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.489912 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd92322a-c1ea-4e44-b7bc-150a93a29650" (UID: "bd92322a-c1ea-4e44-b7bc-150a93a29650"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.490019 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-config" (OuterVolumeSpecName: "config") pod "bd92322a-c1ea-4e44-b7bc-150a93a29650" (UID: "bd92322a-c1ea-4e44-b7bc-150a93a29650"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.491391 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd92322a-c1ea-4e44-b7bc-150a93a29650-kube-api-access-4hmrs" (OuterVolumeSpecName: "kube-api-access-4hmrs") pod "bd92322a-c1ea-4e44-b7bc-150a93a29650" (UID: "bd92322a-c1ea-4e44-b7bc-150a93a29650"). InnerVolumeSpecName "kube-api-access-4hmrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.501505 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd92322a-c1ea-4e44-b7bc-150a93a29650-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd92322a-c1ea-4e44-b7bc-150a93a29650" (UID: "bd92322a-c1ea-4e44-b7bc-150a93a29650"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.540463 4675 generic.go:334] "Generic (PLEG): container finished" podID="bd92322a-c1ea-4e44-b7bc-150a93a29650" containerID="7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa" exitCode=0
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.540558 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" event={"ID":"bd92322a-c1ea-4e44-b7bc-150a93a29650","Type":"ContainerDied","Data":"7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa"}
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.540598 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" event={"ID":"bd92322a-c1ea-4e44-b7bc-150a93a29650","Type":"ContainerDied","Data":"eeff404aaaf6adc13d6da19a482599a7b4e045fe37b107816d45a77d700dec6c"}
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.540625 4675 scope.go:117] "RemoveContainer" containerID="7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa"
Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.540809 4675 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.543602 4675 generic.go:334] "Generic (PLEG): container finished" podID="44da7f78-8444-48dd-ada1-ab8aa7fde155" containerID="b1179d4a45fe7130fa1c1f76b94f1e11f45b9283be1755f2ba1d704adec5f921" exitCode=0 Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.543647 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" event={"ID":"44da7f78-8444-48dd-ada1-ab8aa7fde155","Type":"ContainerDied","Data":"b1179d4a45fe7130fa1c1f76b94f1e11f45b9283be1755f2ba1d704adec5f921"} Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.565898 4675 scope.go:117] "RemoveContainer" containerID="7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa" Mar 20 16:06:00 crc kubenswrapper[4675]: E0320 16:06:00.568806 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa\": container with ID starting with 7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa not found: ID does not exist" containerID="7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.568839 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa"} err="failed to get container status \"7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa\": rpc error: code = NotFound desc = could not find container \"7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa\": container with ID starting with 7feceb5f830e55801711af5bc0fcdf8ac24ad0df8502c104397217ef5ba1efaa not found: ID does not exist" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 
16:06:00.588738 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.588782 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmrs\" (UniqueName: \"kubernetes.io/projected/bd92322a-c1ea-4e44-b7bc-150a93a29650-kube-api-access-4hmrs\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.588794 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd92322a-c1ea-4e44-b7bc-150a93a29650-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.588803 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd92322a-c1ea-4e44-b7bc-150a93a29650-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.614122 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.641875 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"] Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.650533 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9cf68cd7-vb2fx"] Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.680119 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd92322a-c1ea-4e44-b7bc-150a93a29650" path="/var/lib/kubelet/pods/bd92322a-c1ea-4e44-b7bc-150a93a29650/volumes" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.680666 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfeb04b-a650-4820-86ad-84e4cdd56e3b" path="/var/lib/kubelet/pods/fdfeb04b-a650-4820-86ad-84e4cdd56e3b/volumes" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.791218 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-config\") pod \"44da7f78-8444-48dd-ada1-ab8aa7fde155\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.791274 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44da7f78-8444-48dd-ada1-ab8aa7fde155-serving-cert\") pod \"44da7f78-8444-48dd-ada1-ab8aa7fde155\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.791347 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hg4l\" (UniqueName: \"kubernetes.io/projected/44da7f78-8444-48dd-ada1-ab8aa7fde155-kube-api-access-6hg4l\") pod 
\"44da7f78-8444-48dd-ada1-ab8aa7fde155\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.791384 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-client-ca\") pod \"44da7f78-8444-48dd-ada1-ab8aa7fde155\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.791441 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-proxy-ca-bundles\") pod \"44da7f78-8444-48dd-ada1-ab8aa7fde155\" (UID: \"44da7f78-8444-48dd-ada1-ab8aa7fde155\") " Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.792093 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-client-ca" (OuterVolumeSpecName: "client-ca") pod "44da7f78-8444-48dd-ada1-ab8aa7fde155" (UID: "44da7f78-8444-48dd-ada1-ab8aa7fde155"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.792305 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "44da7f78-8444-48dd-ada1-ab8aa7fde155" (UID: "44da7f78-8444-48dd-ada1-ab8aa7fde155"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.792580 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-config" (OuterVolumeSpecName: "config") pod "44da7f78-8444-48dd-ada1-ab8aa7fde155" (UID: "44da7f78-8444-48dd-ada1-ab8aa7fde155"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.795901 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44da7f78-8444-48dd-ada1-ab8aa7fde155-kube-api-access-6hg4l" (OuterVolumeSpecName: "kube-api-access-6hg4l") pod "44da7f78-8444-48dd-ada1-ab8aa7fde155" (UID: "44da7f78-8444-48dd-ada1-ab8aa7fde155"). InnerVolumeSpecName "kube-api-access-6hg4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.796393 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44da7f78-8444-48dd-ada1-ab8aa7fde155-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44da7f78-8444-48dd-ada1-ab8aa7fde155" (UID: "44da7f78-8444-48dd-ada1-ab8aa7fde155"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.892834 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-mvnw2"] Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.892960 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.892982 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.892991 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44da7f78-8444-48dd-ada1-ab8aa7fde155-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 
16:06:00.893000 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hg4l\" (UniqueName: \"kubernetes.io/projected/44da7f78-8444-48dd-ada1-ab8aa7fde155-kube-api-access-6hg4l\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: I0320 16:06:00.893011 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44da7f78-8444-48dd-ada1-ab8aa7fde155-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:00 crc kubenswrapper[4675]: W0320 16:06:00.898320 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa4bf42_3c65_48fb_99c4_0c6d9c2badc6.slice/crio-bc1cb3d645e99736e05e9edc1a2d5bb77c8cc4ce62ba66b5d95226eec253a356 WatchSource:0}: Error finding container bc1cb3d645e99736e05e9edc1a2d5bb77c8cc4ce62ba66b5d95226eec253a356: Status 404 returned error can't find the container with id bc1cb3d645e99736e05e9edc1a2d5bb77c8cc4ce62ba66b5d95226eec253a356 Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.128295 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"] Mar 20 16:06:01 crc kubenswrapper[4675]: E0320 16:06:01.128624 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44da7f78-8444-48dd-ada1-ab8aa7fde155" containerName="controller-manager" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.128644 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="44da7f78-8444-48dd-ada1-ab8aa7fde155" containerName="controller-manager" Mar 20 16:06:01 crc kubenswrapper[4675]: E0320 16:06:01.128683 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd92322a-c1ea-4e44-b7bc-150a93a29650" containerName="route-controller-manager" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.128696 4675 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bd92322a-c1ea-4e44-b7bc-150a93a29650" containerName="route-controller-manager" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.128885 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="44da7f78-8444-48dd-ada1-ab8aa7fde155" containerName="controller-manager" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.128963 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd92322a-c1ea-4e44-b7bc-150a93a29650" containerName="route-controller-manager" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.129494 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.131588 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57bf785689-chrg4"] Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.132453 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.132939 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.133090 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.133287 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.137360 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"] Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.137506 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.137526 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.137851 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.153659 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bf785689-chrg4"] Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.296693 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-config\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " 
pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.296732 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-client-ca\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.296798 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f92b2056-0130-42e2-b4f2-4c688706b379-serving-cert\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.297061 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1db8415-be58-430e-b6b8-2babaabb6396-serving-cert\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.297208 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-config\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.297331 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-client-ca\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.297426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mksz6\" (UniqueName: \"kubernetes.io/projected/a1db8415-be58-430e-b6b8-2babaabb6396-kube-api-access-mksz6\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.297457 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-proxy-ca-bundles\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.297483 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwdlk\" (UniqueName: \"kubernetes.io/projected/f92b2056-0130-42e2-b4f2-4c688706b379-kube-api-access-mwdlk\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.342758 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.342830 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.382527 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.398933 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1db8415-be58-430e-b6b8-2babaabb6396-serving-cert\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.398966 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f92b2056-0130-42e2-b4f2-4c688706b379-serving-cert\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399000 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-config\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399044 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-client-ca\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399066 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mksz6\" (UniqueName: \"kubernetes.io/projected/a1db8415-be58-430e-b6b8-2babaabb6396-kube-api-access-mksz6\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399088 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-proxy-ca-bundles\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399111 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwdlk\" (UniqueName: \"kubernetes.io/projected/f92b2056-0130-42e2-b4f2-4c688706b379-kube-api-access-mwdlk\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399137 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-config\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.399154 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-client-ca\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " 
pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.400264 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-client-ca\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.401598 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-client-ca\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.401716 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-proxy-ca-bundles\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.401940 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-config\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.402060 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-config\") pod 
\"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.403591 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f92b2056-0130-42e2-b4f2-4c688706b379-serving-cert\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.403684 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1db8415-be58-430e-b6b8-2babaabb6396-serving-cert\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.414611 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwdlk\" (UniqueName: \"kubernetes.io/projected/f92b2056-0130-42e2-b4f2-4c688706b379-kube-api-access-mwdlk\") pod \"route-controller-manager-899959959-kjx5x\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") " pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.415254 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mksz6\" (UniqueName: \"kubernetes.io/projected/a1db8415-be58-430e-b6b8-2babaabb6396-kube-api-access-mksz6\") pod \"controller-manager-57bf785689-chrg4\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") " pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.457175 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.469725 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.552308 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" event={"ID":"44da7f78-8444-48dd-ada1-ab8aa7fde155","Type":"ContainerDied","Data":"9964d8f2fcaa33e55bb3004bc683efebe4e13bb1dba5c4f9f15daccc7978be50"} Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.552367 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575f7569d5-rcns4" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.552383 4675 scope.go:117] "RemoveContainer" containerID="b1179d4a45fe7130fa1c1f76b94f1e11f45b9283be1755f2ba1d704adec5f921" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.558049 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-mvnw2" event={"ID":"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6","Type":"ContainerStarted","Data":"bc1cb3d645e99736e05e9edc1a2d5bb77c8cc4ce62ba66b5d95226eec253a356"} Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.601014 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-575f7569d5-rcns4"] Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.615214 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b8vm6" Mar 20 16:06:01 crc kubenswrapper[4675]: I0320 16:06:01.617533 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-575f7569d5-rcns4"] Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 
16:06:02.052843 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"]
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.058475 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bf785689-chrg4"]
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.569416 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" event={"ID":"f92b2056-0130-42e2-b4f2-4c688706b379","Type":"ContainerStarted","Data":"732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5"}
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.569834 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" event={"ID":"f92b2056-0130-42e2-b4f2-4c688706b379","Type":"ContainerStarted","Data":"0234f90cc51a158d14fa161f38d14135c91dbfcf30b802e6444d31dec4b3d794"}
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.571314 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" event={"ID":"a1db8415-be58-430e-b6b8-2babaabb6396","Type":"ContainerStarted","Data":"8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d"}
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.571341 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" event={"ID":"a1db8415-be58-430e-b6b8-2babaabb6396","Type":"ContainerStarted","Data":"d1a64f3544927f30e6828d72e0f782853b0abbbddeca81bb0a73ea1663137706"}
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.684746 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44da7f78-8444-48dd-ada1-ab8aa7fde155" path="/var/lib/kubelet/pods/44da7f78-8444-48dd-ada1-ab8aa7fde155/volumes"
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.969832 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k8lpk"
Mar 20 16:06:02 crc kubenswrapper[4675]: I0320 16:06:02.969923 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k8lpk"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.028338 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k8lpk"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.349625 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4jq5j"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.350409 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4jq5j"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.407407 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4jq5j"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.508081 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b8vm6"]
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.579753 4675 generic.go:334] "Generic (PLEG): container finished" podID="9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6" containerID="9364185492eac3fab8536dff3ddb647c52012925cb9e2d0f978b272cc38088fa" exitCode=0
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.580003 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-mvnw2" event={"ID":"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6","Type":"ContainerDied","Data":"9364185492eac3fab8536dff3ddb647c52012925cb9e2d0f978b272cc38088fa"}
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.580552 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.580741 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b8vm6" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="registry-server" containerID="cri-o://43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29" gracePeriod=2
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.594478 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.627941 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k8lpk"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.629373 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4jq5j"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.630242 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" podStartSLOduration=4.630231825 podStartE2EDuration="4.630231825s" podCreationTimestamp="2026-03-20 16:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:03.629910116 +0000 UTC m=+283.663539653" watchObservedRunningTime="2026-03-20 16:06:03.630231825 +0000 UTC m=+283.663861362"
Mar 20 16:06:03 crc kubenswrapper[4675]: I0320 16:06:03.650171 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" podStartSLOduration=4.650157061 podStartE2EDuration="4.650157061s" podCreationTimestamp="2026-03-20 16:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:03.649850202 +0000 UTC m=+283.683479739" watchObservedRunningTime="2026-03-20 16:06:03.650157061 +0000 UTC m=+283.683786598"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.065727 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8vm6"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.164516 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-catalog-content\") pod \"abca8440-77fa-48b9-a977-9bba2e267728\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") "
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.164606 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-utilities\") pod \"abca8440-77fa-48b9-a977-9bba2e267728\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") "
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.164722 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9n9\" (UniqueName: \"kubernetes.io/projected/abca8440-77fa-48b9-a977-9bba2e267728-kube-api-access-sw9n9\") pod \"abca8440-77fa-48b9-a977-9bba2e267728\" (UID: \"abca8440-77fa-48b9-a977-9bba2e267728\") "
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.165472 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-utilities" (OuterVolumeSpecName: "utilities") pod "abca8440-77fa-48b9-a977-9bba2e267728" (UID: "abca8440-77fa-48b9-a977-9bba2e267728"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.170230 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abca8440-77fa-48b9-a977-9bba2e267728-kube-api-access-sw9n9" (OuterVolumeSpecName: "kube-api-access-sw9n9") pod "abca8440-77fa-48b9-a977-9bba2e267728" (UID: "abca8440-77fa-48b9-a977-9bba2e267728"). InnerVolumeSpecName "kube-api-access-sw9n9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.217125 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abca8440-77fa-48b9-a977-9bba2e267728" (UID: "abca8440-77fa-48b9-a977-9bba2e267728"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.265938 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.265976 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw9n9\" (UniqueName: \"kubernetes.io/projected/abca8440-77fa-48b9-a977-9bba2e267728-kube-api-access-sw9n9\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.265988 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abca8440-77fa-48b9-a977-9bba2e267728-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.425270 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.425363 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.425441 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.426522 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.426643 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995" gracePeriod=600
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.588456 4675 generic.go:334] "Generic (PLEG): container finished" podID="abca8440-77fa-48b9-a977-9bba2e267728" containerID="43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29" exitCode=0
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.588563 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerDied","Data":"43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29"}
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.588594 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8vm6"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.588623 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8vm6" event={"ID":"abca8440-77fa-48b9-a977-9bba2e267728","Type":"ContainerDied","Data":"c62045a7eb5f427789ee6e0f932fd975272d29132df0528cf3741b6a9e895f44"}
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.588649 4675 scope.go:117] "RemoveContainer" containerID="43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.594122 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995" exitCode=0
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.594254 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995"}
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.605575 4675 scope.go:117] "RemoveContainer" containerID="6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.616445 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b8vm6"]
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.618368 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b8vm6"]
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.658178 4675 scope.go:117] "RemoveContainer" containerID="2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.679518 4675 scope.go:117] "RemoveContainer" containerID="43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29"
Mar 20 16:06:04 crc kubenswrapper[4675]: E0320 16:06:04.682182 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29\": container with ID starting with 43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29 not found: ID does not exist" containerID="43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.682230 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29"} err="failed to get container status \"43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29\": rpc error: code = NotFound desc = could not find container \"43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29\": container with ID starting with 43e94d363a1d8032d3e4a8c3157961bca130331c734ab97d56931a929fbfdb29 not found: ID does not exist"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.682266 4675 scope.go:117] "RemoveContainer" containerID="6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72"
Mar 20 16:06:04 crc kubenswrapper[4675]: E0320 16:06:04.682843 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72\": container with ID starting with 6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72 not found: ID does not exist" containerID="6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.682875 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72"} err="failed to get container status \"6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72\": rpc error: code = NotFound desc = could not find container \"6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72\": container with ID starting with 6712a350b942ba20e63e451b0be737c8eccbaad57a6f4e70c3ff213c131d6e72 not found: ID does not exist"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.682900 4675 scope.go:117] "RemoveContainer" containerID="2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9"
Mar 20 16:06:04 crc kubenswrapper[4675]: E0320 16:06:04.683209 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9\": container with ID starting with 2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9 not found: ID does not exist" containerID="2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.683234 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9"} err="failed to get container status \"2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9\": rpc error: code = NotFound desc = could not find container \"2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9\": container with ID starting with 2375eb3d521489a39532df6feb836dae5ac18f0aa33276ebfb8a7c3483af19a9 not found: ID does not exist"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.683580 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abca8440-77fa-48b9-a977-9bba2e267728" path="/var/lib/kubelet/pods/abca8440-77fa-48b9-a977-9bba2e267728/volumes"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.900798 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.974998 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9948\" (UniqueName: \"kubernetes.io/projected/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6-kube-api-access-f9948\") pod \"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6\" (UID: \"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6\") "
Mar 20 16:06:04 crc kubenswrapper[4675]: I0320 16:06:04.982942 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6-kube-api-access-f9948" (OuterVolumeSpecName: "kube-api-access-f9948") pod "9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6" (UID: "9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6"). InnerVolumeSpecName "kube-api-access-f9948". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.076973 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9948\" (UniqueName: \"kubernetes.io/projected/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6-kube-api-access-f9948\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.311402 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jq5j"]
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.601533 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-mvnw2" event={"ID":"9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6","Type":"ContainerDied","Data":"bc1cb3d645e99736e05e9edc1a2d5bb77c8cc4ce62ba66b5d95226eec253a356"}
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.601596 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1cb3d645e99736e05e9edc1a2d5bb77c8cc4ce62ba66b5d95226eec253a356"
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.601545 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-mvnw2"
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.606945 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"67b506092b500b63a2b5d18168ce40f7a503401a387fe638308c8230ccbef555"}
Mar 20 16:06:05 crc kubenswrapper[4675]: I0320 16:06:05.608378 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4jq5j" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="registry-server" containerID="cri-o://95583a7526ead4cb034dda4c12c6f509a8256fd3e651b2a9dd628b9c60dcc619" gracePeriod=2
Mar 20 16:06:06 crc kubenswrapper[4675]: I0320 16:06:06.618044 4675 generic.go:334] "Generic (PLEG): container finished" podID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerID="95583a7526ead4cb034dda4c12c6f509a8256fd3e651b2a9dd628b9c60dcc619" exitCode=0
Mar 20 16:06:06 crc kubenswrapper[4675]: I0320 16:06:06.618151 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jq5j" event={"ID":"e85ec396-6d81-4ad2-b269-315df42e61c4","Type":"ContainerDied","Data":"95583a7526ead4cb034dda4c12c6f509a8256fd3e651b2a9dd628b9c60dcc619"}
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.165328 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jq5j"
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.230571 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxpfv\" (UniqueName: \"kubernetes.io/projected/e85ec396-6d81-4ad2-b269-315df42e61c4-kube-api-access-vxpfv\") pod \"e85ec396-6d81-4ad2-b269-315df42e61c4\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") "
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.230630 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-utilities\") pod \"e85ec396-6d81-4ad2-b269-315df42e61c4\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") "
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.230683 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-catalog-content\") pod \"e85ec396-6d81-4ad2-b269-315df42e61c4\" (UID: \"e85ec396-6d81-4ad2-b269-315df42e61c4\") "
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.231629 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-utilities" (OuterVolumeSpecName: "utilities") pod "e85ec396-6d81-4ad2-b269-315df42e61c4" (UID: "e85ec396-6d81-4ad2-b269-315df42e61c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.240994 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85ec396-6d81-4ad2-b269-315df42e61c4-kube-api-access-vxpfv" (OuterVolumeSpecName: "kube-api-access-vxpfv") pod "e85ec396-6d81-4ad2-b269-315df42e61c4" (UID: "e85ec396-6d81-4ad2-b269-315df42e61c4"). InnerVolumeSpecName "kube-api-access-vxpfv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.259494 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e85ec396-6d81-4ad2-b269-315df42e61c4" (UID: "e85ec396-6d81-4ad2-b269-315df42e61c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.331956 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxpfv\" (UniqueName: \"kubernetes.io/projected/e85ec396-6d81-4ad2-b269-315df42e61c4-kube-api-access-vxpfv\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.331990 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.332001 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ec396-6d81-4ad2-b269-315df42e61c4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.626751 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jq5j" event={"ID":"e85ec396-6d81-4ad2-b269-315df42e61c4","Type":"ContainerDied","Data":"1a8814c23df2d3eb1cccc52cd6c3edbdbfd5d21ed7c4764640893f9f012e8d58"}
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.626810 4675 scope.go:117] "RemoveContainer" containerID="95583a7526ead4cb034dda4c12c6f509a8256fd3e651b2a9dd628b9c60dcc619"
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.626825 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jq5j"
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.646351 4675 scope.go:117] "RemoveContainer" containerID="dd0f23270fbde5306b4ca038a84799ef9d984cc6d4a159c3bd99e8ac86cb2551"
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.654530 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jq5j"]
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.659307 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jq5j"]
Mar 20 16:06:07 crc kubenswrapper[4675]: I0320 16:06:07.672609 4675 scope.go:117] "RemoveContainer" containerID="47e353cc2637519311e8e9b67b7749e422bcdc10d726766c4ae780531e108d33"
Mar 20 16:06:08 crc kubenswrapper[4675]: I0320 16:06:08.681468 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" path="/var/lib/kubelet/pods/e85ec396-6d81-4ad2-b269-315df42e61c4/volumes"
Mar 20 16:06:11 crc kubenswrapper[4675]: I0320 16:06:11.458211 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"
Mar 20 16:06:11 crc kubenswrapper[4675]: I0320 16:06:11.465984 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"
Mar 20 16:06:12 crc kubenswrapper[4675]: I0320 16:06:12.667268 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rpvlc"]
Mar 20 16:06:19 crc kubenswrapper[4675]: I0320 16:06:19.887824 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57bf785689-chrg4"]
Mar 20 16:06:19 crc kubenswrapper[4675]: I0320 16:06:19.888627 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" podUID="a1db8415-be58-430e-b6b8-2babaabb6396" containerName="controller-manager" containerID="cri-o://8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d" gracePeriod=30
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.013086 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"]
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.013629 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" podUID="f92b2056-0130-42e2-b4f2-4c688706b379" containerName="route-controller-manager" containerID="cri-o://732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5" gracePeriod=30
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.467554 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4"
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.473247 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.503467 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-proxy-ca-bundles\") pod \"a1db8415-be58-430e-b6b8-2babaabb6396\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.503593 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-client-ca\") pod \"a1db8415-be58-430e-b6b8-2babaabb6396\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.503663 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mksz6\" (UniqueName: \"kubernetes.io/projected/a1db8415-be58-430e-b6b8-2babaabb6396-kube-api-access-mksz6\") pod \"a1db8415-be58-430e-b6b8-2babaabb6396\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.503689 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1db8415-be58-430e-b6b8-2babaabb6396-serving-cert\") pod \"a1db8415-be58-430e-b6b8-2babaabb6396\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.503712 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-config\") pod \"a1db8415-be58-430e-b6b8-2babaabb6396\" (UID: \"a1db8415-be58-430e-b6b8-2babaabb6396\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.504409 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a1db8415-be58-430e-b6b8-2babaabb6396" (UID: "a1db8415-be58-430e-b6b8-2babaabb6396"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.504482 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-config" (OuterVolumeSpecName: "config") pod "a1db8415-be58-430e-b6b8-2babaabb6396" (UID: "a1db8415-be58-430e-b6b8-2babaabb6396"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.505575 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-client-ca" (OuterVolumeSpecName: "client-ca") pod "a1db8415-be58-430e-b6b8-2babaabb6396" (UID: "a1db8415-be58-430e-b6b8-2babaabb6396"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.510934 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1db8415-be58-430e-b6b8-2babaabb6396-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a1db8415-be58-430e-b6b8-2babaabb6396" (UID: "a1db8415-be58-430e-b6b8-2babaabb6396"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.511209 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1db8415-be58-430e-b6b8-2babaabb6396-kube-api-access-mksz6" (OuterVolumeSpecName: "kube-api-access-mksz6") pod "a1db8415-be58-430e-b6b8-2babaabb6396" (UID: "a1db8415-be58-430e-b6b8-2babaabb6396"). InnerVolumeSpecName "kube-api-access-mksz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.604489 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwdlk\" (UniqueName: \"kubernetes.io/projected/f92b2056-0130-42e2-b4f2-4c688706b379-kube-api-access-mwdlk\") pod \"f92b2056-0130-42e2-b4f2-4c688706b379\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.604573 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f92b2056-0130-42e2-b4f2-4c688706b379-serving-cert\") pod \"f92b2056-0130-42e2-b4f2-4c688706b379\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.604661 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-config\") pod \"f92b2056-0130-42e2-b4f2-4c688706b379\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.604704 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-client-ca\") pod \"f92b2056-0130-42e2-b4f2-4c688706b379\" (UID: \"f92b2056-0130-42e2-b4f2-4c688706b379\") "
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.605483 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-client-ca" (OuterVolumeSpecName: "client-ca") pod "f92b2056-0130-42e2-b4f2-4c688706b379" (UID: "f92b2056-0130-42e2-b4f2-4c688706b379"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.605607 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-config" (OuterVolumeSpecName: "config") pod "f92b2056-0130-42e2-b4f2-4c688706b379" (UID: "f92b2056-0130-42e2-b4f2-4c688706b379"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.605971 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.606002 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mksz6\" (UniqueName: \"kubernetes.io/projected/a1db8415-be58-430e-b6b8-2babaabb6396-kube-api-access-mksz6\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.606022 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1db8415-be58-430e-b6b8-2babaabb6396-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.606039 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.606055 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1db8415-be58-430e-b6b8-2babaabb6396-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.606071 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.606086 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f92b2056-0130-42e2-b4f2-4c688706b379-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.607987 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92b2056-0130-42e2-b4f2-4c688706b379-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f92b2056-0130-42e2-b4f2-4c688706b379" (UID: "f92b2056-0130-42e2-b4f2-4c688706b379"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.608093 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92b2056-0130-42e2-b4f2-4c688706b379-kube-api-access-mwdlk" (OuterVolumeSpecName: "kube-api-access-mwdlk") pod "f92b2056-0130-42e2-b4f2-4c688706b379" (UID: "f92b2056-0130-42e2-b4f2-4c688706b379"). InnerVolumeSpecName "kube-api-access-mwdlk".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.699365 4675 generic.go:334] "Generic (PLEG): container finished" podID="a1db8415-be58-430e-b6b8-2babaabb6396" containerID="8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d" exitCode=0 Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.699419 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" event={"ID":"a1db8415-be58-430e-b6b8-2babaabb6396","Type":"ContainerDied","Data":"8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d"} Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.699446 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" event={"ID":"a1db8415-be58-430e-b6b8-2babaabb6396","Type":"ContainerDied","Data":"d1a64f3544927f30e6828d72e0f782853b0abbbddeca81bb0a73ea1663137706"} Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.699465 4675 scope.go:117] "RemoveContainer" containerID="8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.699568 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bf785689-chrg4" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.704002 4675 generic.go:334] "Generic (PLEG): container finished" podID="f92b2056-0130-42e2-b4f2-4c688706b379" containerID="732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5" exitCode=0 Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.704099 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" event={"ID":"f92b2056-0130-42e2-b4f2-4c688706b379","Type":"ContainerDied","Data":"732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5"} Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.704108 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.704124 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-899959959-kjx5x" event={"ID":"f92b2056-0130-42e2-b4f2-4c688706b379","Type":"ContainerDied","Data":"0234f90cc51a158d14fa161f38d14135c91dbfcf30b802e6444d31dec4b3d794"} Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.707737 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwdlk\" (UniqueName: \"kubernetes.io/projected/f92b2056-0130-42e2-b4f2-4c688706b379-kube-api-access-mwdlk\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.707810 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f92b2056-0130-42e2-b4f2-4c688706b379-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.726989 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-57bf785689-chrg4"] Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.728841 4675 scope.go:117] "RemoveContainer" containerID="8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d" Mar 20 16:06:20 crc kubenswrapper[4675]: E0320 16:06:20.729318 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d\": container with ID starting with 8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d not found: ID does not exist" containerID="8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.729370 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d"} err="failed to get container status \"8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d\": rpc error: code = NotFound desc = could not find container \"8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d\": container with ID starting with 8da967cc22a93310347216eab9a0bc137c505a97a516ce87b17731b7e335711d not found: ID does not exist" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.729391 4675 scope.go:117] "RemoveContainer" containerID="732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.732875 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57bf785689-chrg4"] Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.743378 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"] Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.749712 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-899959959-kjx5x"] Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.757689 4675 scope.go:117] "RemoveContainer" containerID="732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5" Mar 20 16:06:20 crc kubenswrapper[4675]: E0320 16:06:20.758226 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5\": container with ID starting with 732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5 not found: ID does not exist" containerID="732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5" Mar 20 16:06:20 crc kubenswrapper[4675]: I0320 16:06:20.758297 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5"} err="failed to get container status \"732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5\": rpc error: code = NotFound desc = could not find container \"732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5\": container with ID starting with 732953f4bf7e7ce7e1b95c2cba8279029bc0026ff22162f32164121caaaacca5 not found: ID does not exist" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.146557 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g"] Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.149321 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1db8415-be58-430e-b6b8-2babaabb6396" containerName="controller-manager" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.149420 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1db8415-be58-430e-b6b8-2babaabb6396" containerName="controller-manager" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.149502 4675 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="extract-utilities" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.149576 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="extract-utilities" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.149657 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="extract-content" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.149723 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="extract-content" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.149853 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="extract-content" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.149922 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="extract-content" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.149980 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="extract-utilities" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150037 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="extract-utilities" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.150097 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="registry-server" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150150 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="registry-server" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.150208 4675 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6" containerName="oc" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150318 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6" containerName="oc" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.150378 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92b2056-0130-42e2-b4f2-4c688706b379" containerName="route-controller-manager" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150444 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92b2056-0130-42e2-b4f2-4c688706b379" containerName="route-controller-manager" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.150540 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="registry-server" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150617 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="registry-server" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150876 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1db8415-be58-430e-b6b8-2babaabb6396" containerName="controller-manager" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.150985 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92b2056-0130-42e2-b4f2-4c688706b379" containerName="route-controller-manager" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.151068 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85ec396-6d81-4ad2-b269-315df42e61c4" containerName="registry-server" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.151143 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6" containerName="oc" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.151267 4675 
memory_manager.go:354] "RemoveStaleState removing state" podUID="abca8440-77fa-48b9-a977-9bba2e267728" containerName="registry-server" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.152154 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.154079 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.154285 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bf8f8db68-b4nls"] Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.155340 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.160571 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.160736 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.160796 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.161134 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.161173 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.161223 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.161282 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g"] Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.161542 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.161946 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.162106 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.162140 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.162738 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.168471 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.176259 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bf8f8db68-b4nls"] Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214424 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-config\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " 
pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214504 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7m8\" (UniqueName: \"kubernetes.io/projected/9941bbbe-b71d-4640-b877-2717a237f08b-kube-api-access-pj7m8\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214605 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-proxy-ca-bundles\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214647 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-client-ca\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214666 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpqb5\" (UniqueName: \"kubernetes.io/projected/1ece01cb-c99a-453f-b656-b1da5dd52e6e-kube-api-access-jpqb5\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214754 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9941bbbe-b71d-4640-b877-2717a237f08b-serving-cert\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214843 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9941bbbe-b71d-4640-b877-2717a237f08b-client-ca\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.214979 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ece01cb-c99a-453f-b656-b1da5dd52e6e-serving-cert\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.215012 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9941bbbe-b71d-4640-b877-2717a237f08b-config\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.317060 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-config\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: 
\"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.317179 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7m8\" (UniqueName: \"kubernetes.io/projected/9941bbbe-b71d-4640-b877-2717a237f08b-kube-api-access-pj7m8\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318073 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-proxy-ca-bundles\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318214 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-client-ca\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318275 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpqb5\" (UniqueName: \"kubernetes.io/projected/1ece01cb-c99a-453f-b656-b1da5dd52e6e-kube-api-access-jpqb5\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318390 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9941bbbe-b71d-4640-b877-2717a237f08b-serving-cert\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318454 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9941bbbe-b71d-4640-b877-2717a237f08b-client-ca\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318542 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ece01cb-c99a-453f-b656-b1da5dd52e6e-serving-cert\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318608 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9941bbbe-b71d-4640-b877-2717a237f08b-config\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.318853 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-proxy-ca-bundles\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc 
kubenswrapper[4675]: I0320 16:06:21.319374 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-client-ca\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.320422 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ece01cb-c99a-453f-b656-b1da5dd52e6e-config\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.321234 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9941bbbe-b71d-4640-b877-2717a237f08b-client-ca\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.322170 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9941bbbe-b71d-4640-b877-2717a237f08b-config\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.330100 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9941bbbe-b71d-4640-b877-2717a237f08b-serving-cert\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.330677 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ece01cb-c99a-453f-b656-b1da5dd52e6e-serving-cert\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.346057 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpqb5\" (UniqueName: \"kubernetes.io/projected/1ece01cb-c99a-453f-b656-b1da5dd52e6e-kube-api-access-jpqb5\") pod \"controller-manager-7bf8f8db68-b4nls\" (UID: \"1ece01cb-c99a-453f-b656-b1da5dd52e6e\") " pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.352185 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7m8\" (UniqueName: \"kubernetes.io/projected/9941bbbe-b71d-4640-b877-2717a237f08b-kube-api-access-pj7m8\") pod \"route-controller-manager-6cb67f57b4-pdg8g\" (UID: \"9941bbbe-b71d-4640-b877-2717a237f08b\") " pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.493724 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.508830 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.892113 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bf8f8db68-b4nls"] Mar 20 16:06:21 crc kubenswrapper[4675]: W0320 16:06:21.898906 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ece01cb_c99a_453f_b656_b1da5dd52e6e.slice/crio-b60421261c4a6b4743045634f85b399a03daa289d77100fce3a7ce08612514c4 WatchSource:0}: Error finding container b60421261c4a6b4743045634f85b399a03daa289d77100fce3a7ce08612514c4: Status 404 returned error can't find the container with id b60421261c4a6b4743045634f85b399a03daa289d77100fce3a7ce08612514c4 Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.942224 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g"] Mar 20 16:06:21 crc kubenswrapper[4675]: W0320 16:06:21.945797 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9941bbbe_b71d_4640_b877_2717a237f08b.slice/crio-29f201ddcabedc9ab670eb0ff5887d311c652e42c732ee61ecc71a21a51c322b WatchSource:0}: Error finding container 29f201ddcabedc9ab670eb0ff5887d311c652e42c732ee61ecc71a21a51c322b: Status 404 returned error can't find the container with id 29f201ddcabedc9ab670eb0ff5887d311c652e42c732ee61ecc71a21a51c322b Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.979303 4675 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.980484 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982150 4675 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982541 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc" gracePeriod=15 Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982578 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd" gracePeriod=15 Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982629 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807" gracePeriod=15 Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982671 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769" gracePeriod=15 Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982748 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093" gracePeriod=15 Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.982907 4675 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983054 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983071 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983085 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983093 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983106 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983114 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983129 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983138 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 
16:06:21.983147 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983155 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983168 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983176 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983187 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983195 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983205 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983213 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983223 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983231 4675 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983337 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983347 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983358 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983373 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983383 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983391 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983402 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983411 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: E0320 16:06:21.983516 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983526 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:21 crc kubenswrapper[4675]: I0320 16:06:21.983643 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.027498 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.027604 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.027656 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.027701 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.027823 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.027892 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.028040 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.028098 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: E0320 16:06:22.042088 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 
38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-7bf8f8db68-b4nls.189e985232fa9846 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-7bf8f8db68-b4nls,UID:1ece01cb-c99a-453f-b656-b1da5dd52e6e,APIVersion:v1,ResourceVersion:29905,FieldPath:spec.containers{controller-manager},},Reason:Started,Message:Started container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:06:22.040578118 +0000 UTC m=+302.074207655,LastTimestamp:2026-03-20 16:06:22.040578118 +0000 UTC m=+302.074207655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:06:22 crc kubenswrapper[4675]: E0320 16:06:22.075806 4675 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129177 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129231 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 
16:06:22.129260 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129280 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129299 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129314 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129332 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129356 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129421 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129459 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129480 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129498 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129520 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129527 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129545 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.129631 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.377012 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:22 crc kubenswrapper[4675]: W0320 16:06:22.413177 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a96ad1f3e27778b5f3d541f8e579da9be5097eccf8a1dc369a995300f369aa74 WatchSource:0}: Error finding container a96ad1f3e27778b5f3d541f8e579da9be5097eccf8a1dc369a995300f369aa74: Status 404 returned error can't find the container with id a96ad1f3e27778b5f3d541f8e579da9be5097eccf8a1dc369a995300f369aa74 Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.680855 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1db8415-be58-430e-b6b8-2babaabb6396" path="/var/lib/kubelet/pods/a1db8415-be58-430e-b6b8-2babaabb6396/volumes" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.681498 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92b2056-0130-42e2-b4f2-4c688706b379" path="/var/lib/kubelet/pods/f92b2056-0130-42e2-b4f2-4c688706b379/volumes" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.720019 4675 generic.go:334] "Generic (PLEG): container finished" podID="f0fa5466-5ff3-4b74-a932-5ee34be11884" containerID="db035d95a49d25cd35a989ac1d5f1cbfa8c875dc577a370a38ade2f28d556d66" exitCode=0 Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.720087 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f0fa5466-5ff3-4b74-a932-5ee34be11884","Type":"ContainerDied","Data":"db035d95a49d25cd35a989ac1d5f1cbfa8c875dc577a370a38ade2f28d556d66"} Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.720662 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.722647 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.723887 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.724798 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd" exitCode=0 Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.724819 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769" exitCode=0 Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.724828 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807" exitCode=0 Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.724836 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093" exitCode=2 Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.724887 4675 scope.go:117] "RemoveContainer" containerID="eecdfce7f4425a4d6c0ab6467a22e83a4c5df34074defdf1919eadc755257edd" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.726487 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" event={"ID":"1ece01cb-c99a-453f-b656-b1da5dd52e6e","Type":"ContainerStarted","Data":"ff8da52529964fb1c958842fc85cc9d9699c0d39baa05eb4d94e5b1794b0be0e"} Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.726520 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" event={"ID":"1ece01cb-c99a-453f-b656-b1da5dd52e6e","Type":"ContainerStarted","Data":"b60421261c4a6b4743045634f85b399a03daa289d77100fce3a7ce08612514c4"} Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.727071 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.727348 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.727644 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.729140 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" event={"ID":"9941bbbe-b71d-4640-b877-2717a237f08b","Type":"ContainerStarted","Data":"a41db23d74e760a4c8d1d9c3f0f0040c309ad5ebcc7251d5787ccb40107320bd"} Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.729171 
4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" event={"ID":"9941bbbe-b71d-4640-b877-2717a237f08b","Type":"ContainerStarted","Data":"29f201ddcabedc9ab670eb0ff5887d311c652e42c732ee61ecc71a21a51c322b"} Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.729339 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.729632 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.729871 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.730080 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.731082 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a96ad1f3e27778b5f3d541f8e579da9be5097eccf8a1dc369a995300f369aa74"} Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.732801 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.733071 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.733831 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:22 crc kubenswrapper[4675]: I0320 16:06:22.734437 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Mar 20 16:06:23 crc kubenswrapper[4675]: E0320 16:06:23.248137 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-7bf8f8db68-b4nls.189e985232fa9846 
openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-7bf8f8db68-b4nls,UID:1ece01cb-c99a-453f-b656-b1da5dd52e6e,APIVersion:v1,ResourceVersion:29905,FieldPath:spec.containers{controller-manager},},Reason:Started,Message:Started container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:06:22.040578118 +0000 UTC m=+302.074207655,LastTimestamp:2026-03-20 16:06:22.040578118 +0000 UTC m=+302.074207655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.729682 4675 patch_prober.go:28] interesting pod/route-controller-manager-6cb67f57b4-pdg8g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.729792 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.752378 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.757208 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452"}
Mar 20 16:06:23 crc kubenswrapper[4675]: E0320 16:06:23.758216 4675 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.759040 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.759506 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:23 crc kubenswrapper[4675]: I0320 16:06:23.760098 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.204234 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.205250 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.206003 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.206564 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294050 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-kubelet-dir\") pod \"f0fa5466-5ff3-4b74-a932-5ee34be11884\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") "
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294131 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-var-lock\") pod \"f0fa5466-5ff3-4b74-a932-5ee34be11884\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") "
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294162 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0fa5466-5ff3-4b74-a932-5ee34be11884-kube-api-access\") pod \"f0fa5466-5ff3-4b74-a932-5ee34be11884\" (UID: \"f0fa5466-5ff3-4b74-a932-5ee34be11884\") "
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294216 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-var-lock" (OuterVolumeSpecName: "var-lock") pod "f0fa5466-5ff3-4b74-a932-5ee34be11884" (UID: "f0fa5466-5ff3-4b74-a932-5ee34be11884"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294216 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f0fa5466-5ff3-4b74-a932-5ee34be11884" (UID: "f0fa5466-5ff3-4b74-a932-5ee34be11884"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294420 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.294437 4675 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f0fa5466-5ff3-4b74-a932-5ee34be11884-var-lock\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.302871 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fa5466-5ff3-4b74-a932-5ee34be11884-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f0fa5466-5ff3-4b74-a932-5ee34be11884" (UID: "f0fa5466-5ff3-4b74-a932-5ee34be11884"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.354110 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.355480 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.357008 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.357643 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.358337 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.358936 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.395933 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0fa5466-5ff3-4b74-a932-5ee34be11884-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.497462 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.497511 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.497526 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.497624 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.497632 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.497681 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.498210 4675 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.498262 4675 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.498288 4675 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.683199 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.758284 4675 patch_prober.go:28] interesting pod/route-controller-manager-6cb67f57b4-pdg8g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.758357 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.765980 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.766589 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc" exitCode=0
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.766662 4675 scope.go:117] "RemoveContainer" containerID="46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.766686 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.767420 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.767758 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.768655 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.769142 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.769222 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f0fa5466-5ff3-4b74-a932-5ee34be11884","Type":"ContainerDied","Data":"88b25899424e93ad579502fdda25175805e2062c4114d74ad10181270dfe359c"}
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.769256 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b25899424e93ad579502fdda25175805e2062c4114d74ad10181270dfe359c"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.769333 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.769451 4675 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.771640 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.772237 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.772622 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.772976 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.778368 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.778647 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.779021 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.779322 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.787369 4675 scope.go:117] "RemoveContainer" containerID="b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.805337 4675 scope.go:117] "RemoveContainer" containerID="cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.824396 4675 scope.go:117] "RemoveContainer" containerID="e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.840664 4675 scope.go:117] "RemoveContainer" containerID="44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.855948 4675 scope.go:117] "RemoveContainer" containerID="2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.886524 4675 scope.go:117] "RemoveContainer" containerID="46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.887192 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\": container with ID starting with 46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd not found: ID does not exist" containerID="46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.887222 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd"} err="failed to get container status \"46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\": rpc error: code = NotFound desc = could not find container \"46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd\": container with ID starting with 46cf4ed2737e28d9e640beece31f41d3cf8e50f21f8e416ff086171a98e80bdd not found: ID does not exist"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.887239 4675 scope.go:117] "RemoveContainer" containerID="b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.887648 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\": container with ID starting with b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769 not found: ID does not exist" containerID="b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.887694 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769"} err="failed to get container status \"b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\": rpc error: code = NotFound desc = could not find container \"b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769\": container with ID starting with b65034701dcb7b8adf7f682f036604f1f53ee777f16d63fc855cd13f7359d769 not found: ID does not exist"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.887721 4675 scope.go:117] "RemoveContainer" containerID="cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.888475 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\": container with ID starting with cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807 not found: ID does not exist" containerID="cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.888520 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807"} err="failed to get container status \"cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\": rpc error: code = NotFound desc = could not find container \"cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807\": container with ID starting with cea21f5f832d6c6fc613cbd5b0a0bb01c40b8df1b83e9151808d9f9bb13a7807 not found: ID does not exist"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.888552 4675 scope.go:117] "RemoveContainer" containerID="e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.888919 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\": container with ID starting with e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093 not found: ID does not exist" containerID="e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.888960 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093"} err="failed to get container status \"e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\": rpc error: code = NotFound desc = could not find container \"e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093\": container with ID starting with e915adc5b3031072d870ddc582202ee9b8b8c2988cc930c8ba4e9f5c27e45093 not found: ID does not exist"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.888974 4675 scope.go:117] "RemoveContainer" containerID="44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.889313 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\": container with ID starting with 44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc not found: ID does not exist" containerID="44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.889350 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc"} err="failed to get container status \"44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\": rpc error: code = NotFound desc = could not find container \"44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc\": container with ID starting with 44808666e5889e001baf1eac5b7fd97d2d0b012b157068a4c7713b55b4ef1edc not found: ID does not exist"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.889369 4675 scope.go:117] "RemoveContainer" containerID="2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82"
Mar 20 16:06:24 crc kubenswrapper[4675]: E0320 16:06:24.890998 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\": container with ID starting with 2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82 not found: ID does not exist" containerID="2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82"
Mar 20 16:06:24 crc kubenswrapper[4675]: I0320 16:06:24.891054 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82"} err="failed to get container status \"2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\": rpc error: code = NotFound desc = could not find container \"2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82\": container with ID starting with 2482856de5ec7d1479e33ca28eb429570ec624959fe5a03d53d39eb7132cbf82 not found: ID does not exist"
Mar 20 16:06:30 crc kubenswrapper[4675]: I0320 16:06:30.676630 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:30 crc kubenswrapper[4675]: I0320 16:06:30.677852 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:30 crc kubenswrapper[4675]: I0320 16:06:30.678220 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.618561 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.619354 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.619958 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.620262 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.620561 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:31 crc kubenswrapper[4675]: I0320 16:06:31.620602 4675 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.620924 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="200ms"
Mar 20 16:06:31 crc kubenswrapper[4675]: E0320 16:06:31.823043 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="400ms"
Mar 20 16:06:32 crc kubenswrapper[4675]: E0320 16:06:32.224642 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="800ms"
Mar 20 16:06:32 crc kubenswrapper[4675]: I0320 16:06:32.512449 4675 patch_prober.go:28] interesting pod/route-controller-manager-6cb67f57b4-pdg8g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 16:06:32 crc kubenswrapper[4675]: I0320 16:06:32.512535 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:06:33 crc kubenswrapper[4675]: E0320 16:06:33.025029 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="1.6s"
Mar 20 16:06:33 crc kubenswrapper[4675]: E0320 16:06:33.403608 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-7bf8f8db68-b4nls.189e985232fa9846 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-7bf8f8db68-b4nls,UID:1ece01cb-c99a-453f-b656-b1da5dd52e6e,APIVersion:v1,ResourceVersion:29905,FieldPath:spec.containers{controller-manager},},Reason:Started,Message:Started container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:06:22.040578118 +0000 UTC m=+302.074207655,LastTimestamp:2026-03-20 16:06:22.040578118 +0000 UTC m=+302.074207655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:06:34 crc kubenswrapper[4675]: E0320 16:06:34.626554 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="3.2s"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.673160 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.675110 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.675841 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.676319 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.698400 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.698446 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8"
Mar 20 16:06:34 crc kubenswrapper[4675]: E0320 16:06:34.699066 4675 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.699832 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:34 crc kubenswrapper[4675]: W0320 16:06:34.742513 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7fdd420fdb433d2b1d5e8ff7b4b2d074179af8fe1efd6c82c816a1952548ef67 WatchSource:0}: Error finding container 7fdd420fdb433d2b1d5e8ff7b4b2d074179af8fe1efd6c82c816a1952548ef67: Status 404 returned error can't find the container with id 7fdd420fdb433d2b1d5e8ff7b4b2d074179af8fe1efd6c82c816a1952548ef67
Mar 20 16:06:34 crc kubenswrapper[4675]: I0320 16:06:34.831216 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7fdd420fdb433d2b1d5e8ff7b4b2d074179af8fe1efd6c82c816a1952548ef67"}
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.844058 4675 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="faad3041c752afac9d9fb7b4c1ee4cf6d8939da1aff64c04a4891d3b76462367" exitCode=0
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.844377 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8"
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.844404 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8"
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.844129 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"faad3041c752afac9d9fb7b4c1ee4cf6d8939da1aff64c04a4891d3b76462367"}
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.844980 4675 status_manager.go:851] "Failed to get status for pod" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6cb67f57b4-pdg8g\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:35 crc kubenswrapper[4675]: E0320 16:06:35.845164 4675 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.845479 4675 status_manager.go:851] "Failed to get status for pod" podUID="1ece01cb-c99a-453f-b656-b1da5dd52e6e" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7bf8f8db68-b4nls\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:35 crc kubenswrapper[4675]: I0320 16:06:35.846068 4675 status_manager.go:851] "Failed to get status for pod" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused"
Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.857007 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.858658 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.858702 4675 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924" exitCode=1 Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.858756 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924"} Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.859211 4675 scope.go:117] "RemoveContainer" containerID="a4f525aeed4d2659413467b67350b583e3e6ddfde6c9cf705f68874777f58924" Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.862641 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ce6e920bf9895eb700008d7f0f0805306527d3b7b1c02fa3a849b0a5d130020"} Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.862685 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25e1b8039e4e404843a37283d637a32cc7be3ae79ffe551e09b34b0e8b0d9241"} Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.862699 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f79be3be764b1c95c07e063fbb6c30e00015a68d2621f0825ecbcc778679a842"} Mar 20 16:06:36 crc kubenswrapper[4675]: I0320 16:06:36.862707 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6dd88ce9ae0e5eea227d396ba20d6c9a608e665480e13d01967e1c02df1a6104"} Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.692254 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" podUID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" containerName="oauth-openshift" containerID="cri-o://5d22243f7c7b9a95adb578c6134b45e34351d4b8ebe7efb6b284d74cf20faaa0" gracePeriod=15 Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.869636 4675 generic.go:334] "Generic (PLEG): container finished" podID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" containerID="5d22243f7c7b9a95adb578c6134b45e34351d4b8ebe7efb6b284d74cf20faaa0" exitCode=0 Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.869762 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" event={"ID":"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec","Type":"ContainerDied","Data":"5d22243f7c7b9a95adb578c6134b45e34351d4b8ebe7efb6b284d74cf20faaa0"} Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.873785 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"51a5462b90e59b6ca65be555d00cbd99cdc2f922223feb9a2e4377cf674737f3"} Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.874018 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.874220 4675 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8" Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.874252 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8" Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.877400 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.878132 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 16:06:37 crc kubenswrapper[4675]: I0320 16:06:37.878210 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"14d0ab1d6b285aaebd6cf81aed5425dd8e0814c392ae951249ad3caba95f18c1"} Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.472496 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.566933 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-service-ca\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.566972 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-cliconfig\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.566999 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-login\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567028 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-idp-0-file-data\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567048 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-session\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") 
" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567075 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-trusted-ca-bundle\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567099 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-error\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567117 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-serving-cert\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567146 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-provider-selection\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567164 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-dir\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567183 4675 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-router-certs\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567205 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-policies\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567221 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-ocp-branding-template\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.567257 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2xtt\" (UniqueName: \"kubernetes.io/projected/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-kube-api-access-v2xtt\") pod \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\" (UID: \"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec\") " Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.568096 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.568231 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.568867 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.568914 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.569333 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.576316 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.576566 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-kube-api-access-v2xtt" (OuterVolumeSpecName: "kube-api-access-v2xtt") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "kube-api-access-v2xtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.576551 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.577320 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.580639 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.581422 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.584166 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.584334 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.584696 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" (UID: "444674eb-f24e-4bdc-ba4a-ca2c9ac876ec"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668531 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668589 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668609 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668628 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668651 4675 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-dir\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668670 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668689 4675 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668708 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668726 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2xtt\" (UniqueName: \"kubernetes.io/projected/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-kube-api-access-v2xtt\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668742 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668794 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668813 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668830 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.668847 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.886081 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" event={"ID":"444674eb-f24e-4bdc-ba4a-ca2c9ac876ec","Type":"ContainerDied","Data":"c5847fc2a0fd333c3e1b4e313943e8a44bdbfc72b0d60949b5755cd36f8f62f1"} Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.886163 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rpvlc" Mar 20 16:06:38 crc kubenswrapper[4675]: I0320 16:06:38.886168 4675 scope.go:117] "RemoveContainer" containerID="5d22243f7c7b9a95adb578c6134b45e34351d4b8ebe7efb6b284d74cf20faaa0" Mar 20 16:06:39 crc kubenswrapper[4675]: I0320 16:06:39.701251 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:39 crc kubenswrapper[4675]: I0320 16:06:39.701325 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:39 crc kubenswrapper[4675]: I0320 16:06:39.711897 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.494753 4675 patch_prober.go:28] interesting pod/route-controller-manager-6cb67f57b4-pdg8g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.495122 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" podUID="9941bbbe-b71d-4640-b877-2717a237f08b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.885070 4675 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.911446 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8" Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.911487 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8" Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.916405 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:42 crc kubenswrapper[4675]: I0320 16:06:42.918635 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c275ca73-c3fc-43c1-8918-b4625b7c9016" Mar 20 16:06:43 crc kubenswrapper[4675]: E0320 16:06:43.117040 4675 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Mar 20 16:06:43 crc kubenswrapper[4675]: E0320 16:06:43.511035 4675 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Mar 20 16:06:43 crc kubenswrapper[4675]: I0320 16:06:43.921877 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8" Mar 20 16:06:43 crc kubenswrapper[4675]: I0320 16:06:43.921925 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8" Mar 20 16:06:45 crc kubenswrapper[4675]: I0320 16:06:45.895617 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:06:45 crc kubenswrapper[4675]: I0320 16:06:45.895860 4675 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 16:06:45 crc kubenswrapper[4675]: I0320 16:06:45.895982 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 16:06:46 crc kubenswrapper[4675]: I0320 16:06:46.390325 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:06:50 crc kubenswrapper[4675]: I0320 16:06:50.702244 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c275ca73-c3fc-43c1-8918-b4625b7c9016" Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.350460 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.494611 4675 patch_prober.go:28] interesting pod/route-controller-manager-6cb67f57b4-pdg8g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.495145 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" 
podUID="9941bbbe-b71d-4640-b877-2717a237f08b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.985452 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6cb67f57b4-pdg8g_9941bbbe-b71d-4640-b877-2717a237f08b/route-controller-manager/0.log" Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.985519 4675 generic.go:334] "Generic (PLEG): container finished" podID="9941bbbe-b71d-4640-b877-2717a237f08b" containerID="a41db23d74e760a4c8d1d9c3f0f0040c309ad5ebcc7251d5787ccb40107320bd" exitCode=255 Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.985553 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" event={"ID":"9941bbbe-b71d-4640-b877-2717a237f08b","Type":"ContainerDied","Data":"a41db23d74e760a4c8d1d9c3f0f0040c309ad5ebcc7251d5787ccb40107320bd"} Mar 20 16:06:52 crc kubenswrapper[4675]: I0320 16:06:52.987353 4675 scope.go:117] "RemoveContainer" containerID="a41db23d74e760a4c8d1d9c3f0f0040c309ad5ebcc7251d5787ccb40107320bd" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.324425 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.653751 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.656280 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.799817 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.984835 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.994898 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6cb67f57b4-pdg8g_9941bbbe-b71d-4640-b877-2717a237f08b/route-controller-manager/0.log" Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.994966 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" event={"ID":"9941bbbe-b71d-4640-b877-2717a237f08b","Type":"ContainerStarted","Data":"2d1073b46dbe4bb5d478a5084a899adabf68ffecb448c27422ba02d5d2779409"} Mar 20 16:06:53 crc kubenswrapper[4675]: I0320 16:06:53.995325 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.110469 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.138742 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.211256 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.374849 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.519117 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.541444 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.550300 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" Mar 20 16:06:54 crc kubenswrapper[4675]: I0320 16:06:54.552036 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.119856 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.203301 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.207849 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.270698 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.629897 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.670745 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.896396 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 16:06:55 crc kubenswrapper[4675]: I0320 16:06:55.896474 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.020013 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.074081 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.140678 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.171015 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.635233 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.670230 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.722633 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.737422 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 16:06:56 crc kubenswrapper[4675]: I0320 16:06:56.855494 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.008964 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.053552 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.108851 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.137456 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.138754 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.157229 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.204191 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.216633 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.238738 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 
20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.362160 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.366040 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.427055 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.470592 4675 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.483703 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.493930 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.564323 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.583871 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.690930 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.737234 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.779656 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.860402 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 16:06:57 crc kubenswrapper[4675]: I0320 16:06:57.869245 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.051322 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.187617 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.234594 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.334614 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.434490 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.451311 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.453181 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.486598 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 16:06:58 crc 
kubenswrapper[4675]: I0320 16:06:58.486712 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.493724 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.560230 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.571895 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.631149 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.746346 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.795288 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.956536 4675 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.969558 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 16:06:58 crc kubenswrapper[4675]: I0320 16:06:58.976669 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.131796 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 16:06:59 crc 
kubenswrapper[4675]: I0320 16:06:59.268278 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.323784 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.476559 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.497477 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.540652 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.589253 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.681570 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.692668 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.766208 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.842854 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.850288 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.912924 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.919969 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 16:06:59 crc kubenswrapper[4675]: I0320 16:06:59.922679 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.162371 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.203921 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.282071 4675 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.325126 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.421374 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.513451 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.518689 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.532101 4675 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.559033 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.631605 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.736398 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.838856 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.855737 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 16:07:00 crc kubenswrapper[4675]: I0320 16:07:00.907888 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.008512 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.040559 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.084928 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.099148 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 
16:07:01.130312 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.142264 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.158181 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.159838 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.171567 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.241625 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.275101 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.387629 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.392421 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.530098 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.537749 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 
16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.725457 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.750049 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.845205 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.846241 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.877722 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 16:07:01 crc kubenswrapper[4675]: I0320 16:07:01.990663 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.073137 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.083603 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.105491 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.214790 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.234817 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.238222 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.269761 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.271953 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.320227 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.519858 4675 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.604284 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.604519 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.616148 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.658382 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.759537 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.889797 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.896226 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 16:07:02 crc kubenswrapper[4675]: I0320 16:07:02.946372 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.072991 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.095936 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.133796 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.272385 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.288066 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.323309 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.577737 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.677491 4675 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.804810 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.807456 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.820699 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.836978 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.893537 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 16:07:03 crc kubenswrapper[4675]: I0320 16:07:03.902137 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.055259 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.065198 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.092930 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.143752 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 16:07:04 crc kubenswrapper[4675]: 
I0320 16:07:04.187254 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.207870 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.214203 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.219104 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.266962 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.345387 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.353904 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.454958 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.527061 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.569847 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.675092 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.677836 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.725970 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.737165 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.745739 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.791112 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.845454 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.855001 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 16:07:04 crc kubenswrapper[4675]: I0320 16:07:04.922034 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.040502 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.139415 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.298164 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.331583 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.338097 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.504144 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.616092 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.761106 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.857252 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.903094 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.910205 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.920178 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.920281 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.951465 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 16:07:05 crc kubenswrapper[4675]: I0320 16:07:05.976900 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.001805 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.003988 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.088720 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.121816 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.283396 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.343159 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.385121 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.400723 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.565057 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.655313 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.751322 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.757863 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.764745 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.804713 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.812692 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.910816 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.955227 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.971562 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 16:07:06 crc kubenswrapper[4675]: I0320 16:07:06.977832 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.004052 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.008682 4675 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.009268 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cb67f57b4-pdg8g" podStartSLOduration=47.009246181 podStartE2EDuration="47.009246181s" podCreationTimestamp="2026-03-20 16:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:42.516956564 +0000 UTC m=+322.550586111" watchObservedRunningTime="2026-03-20 16:07:07.009246181 +0000 UTC m=+347.042875748"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.010853 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bf8f8db68-b4nls" podStartSLOduration=48.010838877 podStartE2EDuration="48.010838877s" podCreationTimestamp="2026-03-20 16:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:42.540866294 +0000 UTC m=+322.574495851" watchObservedRunningTime="2026-03-20 16:07:07.010838877 +0000 UTC m=+347.044468454"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.016236 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-rpvlc"]
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.016332 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-tlkkl","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 16:07:07 crc kubenswrapper[4675]: E0320 16:07:07.016581 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" containerName="oauth-openshift"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.016609 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" containerName="oauth-openshift"
Mar 20 16:07:07 crc kubenswrapper[4675]: E0320 16:07:07.016643 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" containerName="installer"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.016656 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" containerName="installer"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.016846 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fa5466-5ff3-4b74-a932-5ee34be11884" containerName="installer"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.016880 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" containerName="oauth-openshift"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.017152 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.017278 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e1f82c7-b739-4c4a-a633-26b6f2b68da8"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.017482 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.020951 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.021032 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.021998 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.022037 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.022055 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.022143 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.022220 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.024219 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.024303 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.024467 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.024830 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.025425 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.025828 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.034614 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.036539 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.044805 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.070691 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.070674054 podStartE2EDuration="25.070674054s" podCreationTimestamp="2026-03-20 16:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:07.067051089 +0000 UTC m=+347.100680646" watchObservedRunningTime="2026-03-20 16:07:07.070674054 +0000 UTC m=+347.104303591"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143041 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143112 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143172 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-audit-policies\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143222 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143272 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143526 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64crx\" (UniqueName: \"kubernetes.io/projected/e5c66eff-8fed-4e23-ae91-78e52f275c8e-kube-api-access-64crx\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143615 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143670 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143717 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143812 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143881 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.143962 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5c66eff-8fed-4e23-ae91-78e52f275c8e-audit-dir\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.144028 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.144071 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.223081 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245485 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64crx\" (UniqueName: \"kubernetes.io/projected/e5c66eff-8fed-4e23-ae91-78e52f275c8e-kube-api-access-64crx\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245564 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245618 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245644 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245693 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245746 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245825 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5c66eff-8fed-4e23-ae91-78e52f275c8e-audit-dir\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245862 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245911 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245945 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.245992 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.246079 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-audit-policies\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.246118 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.246149 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.246988 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.248141 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5c66eff-8fed-4e23-ae91-78e52f275c8e-audit-dir\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.249285 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.249965 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-audit-policies\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.252669 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.254069 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.256521 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.256870 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.257581 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.258871 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.259101 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.259139 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.259591 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5c66eff-8fed-4e23-ae91-78e52f275c8e-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.272182 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64crx\" (UniqueName: \"kubernetes.io/projected/e5c66eff-8fed-4e23-ae91-78e52f275c8e-kube-api-access-64crx\") pod \"oauth-openshift-748578cd96-tlkkl\" (UID: \"e5c66eff-8fed-4e23-ae91-78e52f275c8e\") " pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.282969 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.343302 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.350970 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.437125 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.539352 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.557441 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.570338 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.617534 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.809761 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-tlkkl"]
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.815824 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 16:07:07 crc kubenswrapper[4675]: I0320 16:07:07.836238 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.039942 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.061354 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.064698 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.084422 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl" event={"ID":"e5c66eff-8fed-4e23-ae91-78e52f275c8e","Type":"ContainerStarted","Data":"11896b4b922079552415140a94b2ab8c89f69c3d99e63bc5d5b8d1756d14fad3"}
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.447889 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.470593 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.472576 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.604638 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.681862 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444674eb-f24e-4bdc-ba4a-ca2c9ac876ec" path="/var/lib/kubelet/pods/444674eb-f24e-4bdc-ba4a-ca2c9ac876ec/volumes"
Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.761365 4675 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 16:07:08 crc kubenswrapper[4675]: I0320 16:07:08.813586 4675 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.019453 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.101805 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.107594 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl" event={"ID":"e5c66eff-8fed-4e23-ae91-78e52f275c8e","Type":"ContainerStarted","Data":"d7ee19515b3c4e832af787bdf436e00e2f8a78432ff39aa09461189beb35f2fd"} Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.107966 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.117161 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.139860 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-748578cd96-tlkkl" podStartSLOduration=57.1398292 podStartE2EDuration="57.1398292s" podCreationTimestamp="2026-03-20 16:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:09.138237844 +0000 UTC m=+349.171867441" watchObservedRunningTime="2026-03-20 16:07:09.1398292 +0000 UTC m=+349.173458767" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 
16:07:09.244690 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.280096 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.286713 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.544845 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.678475 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.698301 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 16:07:09 crc kubenswrapper[4675]: I0320 16:07:09.852842 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 16:07:10 crc kubenswrapper[4675]: I0320 16:07:10.311457 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 16:07:11 crc kubenswrapper[4675]: I0320 16:07:11.056686 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 16:07:16 crc kubenswrapper[4675]: I0320 16:07:16.466222 4675 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:07:16 crc kubenswrapper[4675]: I0320 16:07:16.467102 4675 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452" gracePeriod=5 Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.070643 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.071353 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.179743 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.179830 4675 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452" exitCode=137 Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.179877 4675 scope.go:117] "RemoveContainer" containerID="7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.179894 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.198210 4675 scope.go:117] "RemoveContainer" containerID="7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452" Mar 20 16:07:22 crc kubenswrapper[4675]: E0320 16:07:22.198679 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452\": container with ID starting with 7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452 not found: ID does not exist" containerID="7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.198729 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452"} err="failed to get container status \"7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452\": rpc error: code = NotFound desc = could not find container \"7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452\": container with ID starting with 7d9d3046c1a2c09ee818dca9fdaba631fae934824d4c4a5865126a876eed3452 not found: ID does not exist" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240209 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240318 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240339 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240384 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240443 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240484 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240527 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240602 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240623 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240819 4675 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240840 4675 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240854 4675 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.240867 4675 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.252069 4675 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.341874 4675 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4675]: I0320 16:07:22.684547 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 16:07:55 crc kubenswrapper[4675]: I0320 16:07:55.952282 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xb69f"] Mar 20 16:07:55 crc kubenswrapper[4675]: E0320 16:07:55.953000 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 16:07:55 crc kubenswrapper[4675]: I0320 16:07:55.953013 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 16:07:55 crc kubenswrapper[4675]: I0320 16:07:55.953110 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 16:07:55 crc kubenswrapper[4675]: I0320 16:07:55.953446 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:55 crc kubenswrapper[4675]: I0320 16:07:55.973925 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xb69f"] Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.107912 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9q8p\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-kube-api-access-m9q8p\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.108358 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-registry-tls\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.108553 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-bound-sa-token\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.108853 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af351c51-a4c2-43b4-bbcf-d1da1c809168-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.109045 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af351c51-a4c2-43b4-bbcf-d1da1c809168-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.109233 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af351c51-a4c2-43b4-bbcf-d1da1c809168-registry-certificates\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.109422 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.109642 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af351c51-a4c2-43b4-bbcf-d1da1c809168-trusted-ca\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.134557 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.210798 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af351c51-a4c2-43b4-bbcf-d1da1c809168-trusted-ca\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.210878 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9q8p\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-kube-api-access-m9q8p\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.210900 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-registry-tls\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.210921 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-bound-sa-token\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 
16:07:56.210964 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af351c51-a4c2-43b4-bbcf-d1da1c809168-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.210980 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af351c51-a4c2-43b4-bbcf-d1da1c809168-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.210995 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af351c51-a4c2-43b4-bbcf-d1da1c809168-registry-certificates\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.211653 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af351c51-a4c2-43b4-bbcf-d1da1c809168-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.212139 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af351c51-a4c2-43b4-bbcf-d1da1c809168-trusted-ca\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.212284 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af351c51-a4c2-43b4-bbcf-d1da1c809168-registry-certificates\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.220291 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af351c51-a4c2-43b4-bbcf-d1da1c809168-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.221081 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-registry-tls\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.230976 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9q8p\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-kube-api-access-m9q8p\") pod \"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.231881 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af351c51-a4c2-43b4-bbcf-d1da1c809168-bound-sa-token\") pod 
\"image-registry-66df7c8f76-xb69f\" (UID: \"af351c51-a4c2-43b4-bbcf-d1da1c809168\") " pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.311901 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:56 crc kubenswrapper[4675]: I0320 16:07:56.752072 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xb69f"] Mar 20 16:07:57 crc kubenswrapper[4675]: I0320 16:07:57.388610 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" event={"ID":"af351c51-a4c2-43b4-bbcf-d1da1c809168","Type":"ContainerStarted","Data":"d869c24a2d69fc9fe0cdd9908b51cc5cadd87aa7a884017ebf04cda281da30d4"} Mar 20 16:07:57 crc kubenswrapper[4675]: I0320 16:07:57.388664 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" event={"ID":"af351c51-a4c2-43b4-bbcf-d1da1c809168","Type":"ContainerStarted","Data":"4a6b317341b6631ccef4d3db64a8347d0a84a85bbf73dcfd8ca52f9f2615333a"} Mar 20 16:07:57 crc kubenswrapper[4675]: I0320 16:07:57.388815 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" Mar 20 16:07:57 crc kubenswrapper[4675]: I0320 16:07:57.426292 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f" podStartSLOduration=2.426257499 podStartE2EDuration="2.426257499s" podCreationTimestamp="2026-03-20 16:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:57.414948451 +0000 UTC m=+397.448577998" watchObservedRunningTime="2026-03-20 16:07:57.426257499 +0000 UTC m=+397.459887076" Mar 20 16:08:00 crc 
kubenswrapper[4675]: I0320 16:08:00.194790 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567048-2ct4l"]
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.196638 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.201937 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.201973 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.203906 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.212846 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-2ct4l"]
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.386337 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkvn\" (UniqueName: \"kubernetes.io/projected/0e7176cd-8e2e-4a66-81e4-41b8502d8813-kube-api-access-4vkvn\") pod \"auto-csr-approver-29567048-2ct4l\" (UID: \"0e7176cd-8e2e-4a66-81e4-41b8502d8813\") " pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.488321 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkvn\" (UniqueName: \"kubernetes.io/projected/0e7176cd-8e2e-4a66-81e4-41b8502d8813-kube-api-access-4vkvn\") pod \"auto-csr-approver-29567048-2ct4l\" (UID: \"0e7176cd-8e2e-4a66-81e4-41b8502d8813\") " pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.508337 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkvn\" (UniqueName: \"kubernetes.io/projected/0e7176cd-8e2e-4a66-81e4-41b8502d8813-kube-api-access-4vkvn\") pod \"auto-csr-approver-29567048-2ct4l\" (UID: \"0e7176cd-8e2e-4a66-81e4-41b8502d8813\") " pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:00 crc kubenswrapper[4675]: I0320 16:08:00.532223 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:01 crc kubenswrapper[4675]: I0320 16:08:01.012729 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-2ct4l"]
Mar 20 16:08:01 crc kubenswrapper[4675]: I0320 16:08:01.415010 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-2ct4l" event={"ID":"0e7176cd-8e2e-4a66-81e4-41b8502d8813","Type":"ContainerStarted","Data":"5facd0b590d32ec1688b8354e35351aa5d87fa4a46867e404c577cec53b73c6e"}
Mar 20 16:08:02 crc kubenswrapper[4675]: I0320 16:08:02.422417 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-2ct4l" event={"ID":"0e7176cd-8e2e-4a66-81e4-41b8502d8813","Type":"ContainerStarted","Data":"c11c751a61ba3e0c414d0610c62e3dea1dfe20836050eabe5667db0f7d31844a"}
Mar 20 16:08:02 crc kubenswrapper[4675]: I0320 16:08:02.442658 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567048-2ct4l" podStartSLOduration=1.355883906 podStartE2EDuration="2.442627553s" podCreationTimestamp="2026-03-20 16:08:00 +0000 UTC" firstStartedPulling="2026-03-20 16:08:01.023395154 +0000 UTC m=+401.057024691" lastFinishedPulling="2026-03-20 16:08:02.110138801 +0000 UTC m=+402.143768338" observedRunningTime="2026-03-20 16:08:02.438853343 +0000 UTC m=+402.472482880" watchObservedRunningTime="2026-03-20 16:08:02.442627553 +0000 UTC m=+402.476257120"
Mar 20 16:08:03 crc kubenswrapper[4675]: I0320 16:08:03.431270 4675 generic.go:334] "Generic (PLEG): container finished" podID="0e7176cd-8e2e-4a66-81e4-41b8502d8813" containerID="c11c751a61ba3e0c414d0610c62e3dea1dfe20836050eabe5667db0f7d31844a" exitCode=0
Mar 20 16:08:03 crc kubenswrapper[4675]: I0320 16:08:03.431390 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-2ct4l" event={"ID":"0e7176cd-8e2e-4a66-81e4-41b8502d8813","Type":"ContainerDied","Data":"c11c751a61ba3e0c414d0610c62e3dea1dfe20836050eabe5667db0f7d31844a"}
Mar 20 16:08:04 crc kubenswrapper[4675]: I0320 16:08:04.424591 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:08:04 crc kubenswrapper[4675]: I0320 16:08:04.425149 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:04 crc kubenswrapper[4675]: I0320 16:08:04.752292 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:04 crc kubenswrapper[4675]: I0320 16:08:04.953652 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vkvn\" (UniqueName: \"kubernetes.io/projected/0e7176cd-8e2e-4a66-81e4-41b8502d8813-kube-api-access-4vkvn\") pod \"0e7176cd-8e2e-4a66-81e4-41b8502d8813\" (UID: \"0e7176cd-8e2e-4a66-81e4-41b8502d8813\") "
Mar 20 16:08:04 crc kubenswrapper[4675]: I0320 16:08:04.964859 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7176cd-8e2e-4a66-81e4-41b8502d8813-kube-api-access-4vkvn" (OuterVolumeSpecName: "kube-api-access-4vkvn") pod "0e7176cd-8e2e-4a66-81e4-41b8502d8813" (UID: "0e7176cd-8e2e-4a66-81e4-41b8502d8813"). InnerVolumeSpecName "kube-api-access-4vkvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:05 crc kubenswrapper[4675]: I0320 16:08:05.054726 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vkvn\" (UniqueName: \"kubernetes.io/projected/0e7176cd-8e2e-4a66-81e4-41b8502d8813-kube-api-access-4vkvn\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:05 crc kubenswrapper[4675]: I0320 16:08:05.448273 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-2ct4l" event={"ID":"0e7176cd-8e2e-4a66-81e4-41b8502d8813","Type":"ContainerDied","Data":"5facd0b590d32ec1688b8354e35351aa5d87fa4a46867e404c577cec53b73c6e"}
Mar 20 16:08:05 crc kubenswrapper[4675]: I0320 16:08:05.448330 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5facd0b590d32ec1688b8354e35351aa5d87fa4a46867e404c577cec53b73c6e"
Mar 20 16:08:05 crc kubenswrapper[4675]: I0320 16:08:05.448375 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-2ct4l"
Mar 20 16:08:16 crc kubenswrapper[4675]: I0320 16:08:16.323685 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xb69f"
Mar 20 16:08:16 crc kubenswrapper[4675]: I0320 16:08:16.390912 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mz2tv"]
Mar 20 16:08:34 crc kubenswrapper[4675]: I0320 16:08:34.427632 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:08:34 crc kubenswrapper[4675]: I0320 16:08:34.428528 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:41 crc kubenswrapper[4675]: I0320 16:08:41.445864 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" podUID="7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" containerName="registry" containerID="cri-o://669672e347272c8e516eebbf132a7f42388c9f8cdb077afe28f3e30743575b62" gracePeriod=30
Mar 20 16:08:41 crc kubenswrapper[4675]: I0320 16:08:41.708023 4675 generic.go:334] "Generic (PLEG): container finished" podID="7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" containerID="669672e347272c8e516eebbf132a7f42388c9f8cdb077afe28f3e30743575b62" exitCode=0
Mar 20 16:08:41 crc kubenswrapper[4675]: I0320 16:08:41.708299 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" event={"ID":"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461","Type":"ContainerDied","Data":"669672e347272c8e516eebbf132a7f42388c9f8cdb077afe28f3e30743575b62"}
Mar 20 16:08:41 crc kubenswrapper[4675]: I0320 16:08:41.816050 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.009044 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-certificates\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.009457 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-ca-trust-extracted\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.009848 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2ksw\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-kube-api-access-s2ksw\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.009963 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-tls\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.010017 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-bound-sa-token\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.010077 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-installation-pull-secrets\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.010193 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-trusted-ca\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.010443 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\" (UID: \"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461\") "
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.010750 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.010917 4675 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.012430 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.017855 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.018741 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-kube-api-access-s2ksw" (OuterVolumeSpecName: "kube-api-access-s2ksw") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "kube-api-access-s2ksw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.019668 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.029675 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.031024 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.044993 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" (UID: "7b03bbc4-2ec2-4f58-bd4d-20770e1fb461"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.112248 4675 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.112308 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2ksw\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-kube-api-access-s2ksw\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.112331 4675 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.112349 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.112370 4675 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.112398 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.718447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv" event={"ID":"7b03bbc4-2ec2-4f58-bd4d-20770e1fb461","Type":"ContainerDied","Data":"a08847d75860dc064cc57090f6de4f351c7d20245460f43839502b5ea2c0a60c"}
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.718533 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mz2tv"
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.718743 4675 scope.go:117] "RemoveContainer" containerID="669672e347272c8e516eebbf132a7f42388c9f8cdb077afe28f3e30743575b62"
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.744743 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mz2tv"]
Mar 20 16:08:42 crc kubenswrapper[4675]: I0320 16:08:42.750362 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mz2tv"]
Mar 20 16:08:44 crc kubenswrapper[4675]: I0320 16:08:44.688874 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" path="/var/lib/kubelet/pods/7b03bbc4-2ec2-4f58-bd4d-20770e1fb461/volumes"
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.424651 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.425198 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.425242 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5"
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.425797 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67b506092b500b63a2b5d18168ce40f7a503401a387fe638308c8230ccbef555"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.425842 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://67b506092b500b63a2b5d18168ce40f7a503401a387fe638308c8230ccbef555" gracePeriod=600
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.871695 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="67b506092b500b63a2b5d18168ce40f7a503401a387fe638308c8230ccbef555" exitCode=0
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.871904 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"67b506092b500b63a2b5d18168ce40f7a503401a387fe638308c8230ccbef555"}
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.872126 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"c758d44584e0697febf8722629cecc7107d8ebc056ab2b31afed6d9fd7d43fbf"}
Mar 20 16:09:04 crc kubenswrapper[4675]: I0320 16:09:04.872150 4675 scope.go:117] "RemoveContainer" containerID="d59323867d724f5bf266854202b2400e5096f3c3706caf7907ceb0a8b81e9995"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.830023 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwsgc"]
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.831047 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bwsgc" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="registry-server" containerID="cri-o://35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa" gracePeriod=30
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.849293 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7n9r9"]
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.852261 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gnrqz"]
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.852483 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7n9r9" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="registry-server" containerID="cri-o://ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12" gracePeriod=30
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.852653 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator" containerID="cri-o://34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55" gracePeriod=30
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.869876 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8lpk"]
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.870137 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k8lpk" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="registry-server" containerID="cri-o://b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a" gracePeriod=30
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.873872 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xhj8r"]
Mar 20 16:09:11 crc kubenswrapper[4675]: E0320 16:09:11.874090 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" containerName="registry"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.874110 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" containerName="registry"
Mar 20 16:09:11 crc kubenswrapper[4675]: E0320 16:09:11.874134 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7176cd-8e2e-4a66-81e4-41b8502d8813" containerName="oc"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.874143 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7176cd-8e2e-4a66-81e4-41b8502d8813" containerName="oc"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.874270 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7176cd-8e2e-4a66-81e4-41b8502d8813" containerName="oc"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.874285 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b03bbc4-2ec2-4f58-bd4d-20770e1fb461" containerName="registry"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.874647 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.878147 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-blmfl"]
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.878392 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-blmfl" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="registry-server" containerID="cri-o://014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43" gracePeriod=30
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.899546 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xhj8r"]
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.937104 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.937141 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:11 crc kubenswrapper[4675]: I0320 16:09:11.937170 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-kube-api-access-n72nb\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.039970 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.040008 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.040031 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-kube-api-access-n72nb\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.041203 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.050260 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.055363 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/d62ebd56-8ca5-4cbe-b7af-15d2164540fe-kube-api-access-n72nb\") pod \"marketplace-operator-79b997595-xhj8r\" (UID: \"d62ebd56-8ca5-4cbe-b7af-15d2164540fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.210041 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.309050 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7n9r9"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.338812 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.339314 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwsgc"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.349960 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8lpk"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.377519 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blmfl"
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454028 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-utilities\") pod \"4d82201b-fc6f-4776-87f3-7cf89822bda5\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454103 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltc8d\" (UniqueName: \"kubernetes.io/projected/239989a6-b2d8-4061-88d4-a8a6a656fe6b-kube-api-access-ltc8d\") pod \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454129 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-utilities\") pod \"79c663f2-11ef-4c23-ac68-b8bb32997e77\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454151 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-trusted-ca\") pod \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454167 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-catalog-content\") pod \"79c663f2-11ef-4c23-ac68-b8bb32997e77\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454184 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz6lt\" (UniqueName: \"kubernetes.io/projected/c311c63c-0f7e-4435-a2e3-fbc85a59594e-kube-api-access-lz6lt\") pod \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454214 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-catalog-content\") pod \"4d82201b-fc6f-4776-87f3-7cf89822bda5\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454230 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-utilities\") pod \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454246 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-operator-metrics\") pod \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\" (UID: \"c311c63c-0f7e-4435-a2e3-fbc85a59594e\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454311 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvc5m\" (UniqueName: \"kubernetes.io/projected/79c663f2-11ef-4c23-ac68-b8bb32997e77-kube-api-access-tvc5m\") pod \"79c663f2-11ef-4c23-ac68-b8bb32997e77\" (UID: \"79c663f2-11ef-4c23-ac68-b8bb32997e77\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454330 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-catalog-content\") pod \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\" (UID: \"239989a6-b2d8-4061-88d4-a8a6a656fe6b\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454350 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-catalog-content\") pod \"ac02846f-d933-4f76-9085-19a28023c633\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454369 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spbpv\" (UniqueName: \"kubernetes.io/projected/ac02846f-d933-4f76-9085-19a28023c633-kube-api-access-spbpv\") pod \"ac02846f-d933-4f76-9085-19a28023c633\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454391 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-utilities\") pod \"ac02846f-d933-4f76-9085-19a28023c633\" (UID: \"ac02846f-d933-4f76-9085-19a28023c633\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.454419 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szdg\" (UniqueName: \"kubernetes.io/projected/4d82201b-fc6f-4776-87f3-7cf89822bda5-kube-api-access-8szdg\") pod \"4d82201b-fc6f-4776-87f3-7cf89822bda5\" (UID: \"4d82201b-fc6f-4776-87f3-7cf89822bda5\") "
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.455167 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-utilities" (OuterVolumeSpecName: "utilities") pod "4d82201b-fc6f-4776-87f3-7cf89822bda5" (UID: "4d82201b-fc6f-4776-87f3-7cf89822bda5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.455480 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-utilities" (OuterVolumeSpecName: "utilities") pod "239989a6-b2d8-4061-88d4-a8a6a656fe6b" (UID: "239989a6-b2d8-4061-88d4-a8a6a656fe6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.456662 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-utilities" (OuterVolumeSpecName: "utilities") pod "79c663f2-11ef-4c23-ac68-b8bb32997e77" (UID: "79c663f2-11ef-4c23-ac68-b8bb32997e77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.456727 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-utilities" (OuterVolumeSpecName: "utilities") pod "ac02846f-d933-4f76-9085-19a28023c633" (UID: "ac02846f-d933-4f76-9085-19a28023c633"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.456743 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c311c63c-0f7e-4435-a2e3-fbc85a59594e" (UID: "c311c63c-0f7e-4435-a2e3-fbc85a59594e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.460232 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d82201b-fc6f-4776-87f3-7cf89822bda5-kube-api-access-8szdg" (OuterVolumeSpecName: "kube-api-access-8szdg") pod "4d82201b-fc6f-4776-87f3-7cf89822bda5" (UID: "4d82201b-fc6f-4776-87f3-7cf89822bda5"). InnerVolumeSpecName "kube-api-access-8szdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.460461 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239989a6-b2d8-4061-88d4-a8a6a656fe6b-kube-api-access-ltc8d" (OuterVolumeSpecName: "kube-api-access-ltc8d") pod "239989a6-b2d8-4061-88d4-a8a6a656fe6b" (UID: "239989a6-b2d8-4061-88d4-a8a6a656fe6b"). InnerVolumeSpecName "kube-api-access-ltc8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.460569 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac02846f-d933-4f76-9085-19a28023c633-kube-api-access-spbpv" (OuterVolumeSpecName: "kube-api-access-spbpv") pod "ac02846f-d933-4f76-9085-19a28023c633" (UID: "ac02846f-d933-4f76-9085-19a28023c633"). InnerVolumeSpecName "kube-api-access-spbpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.460586 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c663f2-11ef-4c23-ac68-b8bb32997e77-kube-api-access-tvc5m" (OuterVolumeSpecName: "kube-api-access-tvc5m") pod "79c663f2-11ef-4c23-ac68-b8bb32997e77" (UID: "79c663f2-11ef-4c23-ac68-b8bb32997e77"). InnerVolumeSpecName "kube-api-access-tvc5m".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.463701 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c311c63c-0f7e-4435-a2e3-fbc85a59594e-kube-api-access-lz6lt" (OuterVolumeSpecName: "kube-api-access-lz6lt") pod "c311c63c-0f7e-4435-a2e3-fbc85a59594e" (UID: "c311c63c-0f7e-4435-a2e3-fbc85a59594e"). InnerVolumeSpecName "kube-api-access-lz6lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.468876 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c311c63c-0f7e-4435-a2e3-fbc85a59594e" (UID: "c311c63c-0f7e-4435-a2e3-fbc85a59594e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.497583 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d82201b-fc6f-4776-87f3-7cf89822bda5" (UID: "4d82201b-fc6f-4776-87f3-7cf89822bda5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.512638 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79c663f2-11ef-4c23-ac68-b8bb32997e77" (UID: "79c663f2-11ef-4c23-ac68-b8bb32997e77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.539074 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "239989a6-b2d8-4061-88d4-a8a6a656fe6b" (UID: "239989a6-b2d8-4061-88d4-a8a6a656fe6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555670 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8szdg\" (UniqueName: \"kubernetes.io/projected/4d82201b-fc6f-4776-87f3-7cf89822bda5-kube-api-access-8szdg\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555709 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555720 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltc8d\" (UniqueName: \"kubernetes.io/projected/239989a6-b2d8-4061-88d4-a8a6a656fe6b-kube-api-access-ltc8d\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555730 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555740 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555755 4675 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c663f2-11ef-4c23-ac68-b8bb32997e77-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555787 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz6lt\" (UniqueName: \"kubernetes.io/projected/c311c63c-0f7e-4435-a2e3-fbc85a59594e-kube-api-access-lz6lt\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555799 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d82201b-fc6f-4776-87f3-7cf89822bda5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555810 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555822 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c311c63c-0f7e-4435-a2e3-fbc85a59594e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555833 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvc5m\" (UniqueName: \"kubernetes.io/projected/79c663f2-11ef-4c23-ac68-b8bb32997e77-kube-api-access-tvc5m\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555844 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239989a6-b2d8-4061-88d4-a8a6a656fe6b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555855 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spbpv\" (UniqueName: 
\"kubernetes.io/projected/ac02846f-d933-4f76-9085-19a28023c633-kube-api-access-spbpv\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.555865 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.618607 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac02846f-d933-4f76-9085-19a28023c633" (UID: "ac02846f-d933-4f76-9085-19a28023c633"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.656603 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02846f-d933-4f76-9085-19a28023c633-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.658537 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xhj8r"] Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.928879 4675 generic.go:334] "Generic (PLEG): container finished" podID="ac02846f-d933-4f76-9085-19a28023c633" containerID="014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43" exitCode=0 Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.928942 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-blmfl" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.928953 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerDied","Data":"014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.929335 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blmfl" event={"ID":"ac02846f-d933-4f76-9085-19a28023c633","Type":"ContainerDied","Data":"0da9a7833586d5e50ace11f91c22665d360749d0c5e13d8f3865e2358a214a03"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.929369 4675 scope.go:117] "RemoveContainer" containerID="014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.933951 4675 generic.go:334] "Generic (PLEG): container finished" podID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerID="35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa" exitCode=0 Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.934002 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerDied","Data":"35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.934032 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwsgc" event={"ID":"239989a6-b2d8-4061-88d4-a8a6a656fe6b","Type":"ContainerDied","Data":"58b2ab3ecae9230ac0ea02a62cab9cfa9804736cd315d7fdf103f286d15e5a22"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.934086 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwsgc" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.936887 4675 generic.go:334] "Generic (PLEG): container finished" podID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerID="b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a" exitCode=0 Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.936929 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8lpk" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.936961 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerDied","Data":"b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.937033 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8lpk" event={"ID":"4d82201b-fc6f-4776-87f3-7cf89822bda5","Type":"ContainerDied","Data":"dc0965d4a3bd1e1d3639e1beef1975329b5d90351313e41c22a6ef00ea2a833e"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.945596 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r" event={"ID":"d62ebd56-8ca5-4cbe-b7af-15d2164540fe","Type":"ContainerStarted","Data":"e18a64b333dd55a976ba8994391c1d329cac0b946906b2d45b6af442b1849824"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.945650 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r" event={"ID":"d62ebd56-8ca5-4cbe-b7af-15d2164540fe","Type":"ContainerStarted","Data":"26e71ac05f5b301b7aab9528f6d5b571a5ca0ca4bbf327d47aaabd6a98315d66"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.949967 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerID="34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55" exitCode=0 Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.950040 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.950038 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" event={"ID":"c311c63c-0f7e-4435-a2e3-fbc85a59594e","Type":"ContainerDied","Data":"34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.950091 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gnrqz" event={"ID":"c311c63c-0f7e-4435-a2e3-fbc85a59594e","Type":"ContainerDied","Data":"b01840b87a401000182470eba216332e1a267fdc769e78e6bb5f9e53cda67173"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.950642 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-blmfl"] Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.954407 4675 scope.go:117] "RemoveContainer" containerID="35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.954793 4675 generic.go:334] "Generic (PLEG): container finished" podID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerID="ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12" exitCode=0 Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.954833 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7n9r9" event={"ID":"79c663f2-11ef-4c23-ac68-b8bb32997e77","Type":"ContainerDied","Data":"ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.954862 
4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7n9r9" event={"ID":"79c663f2-11ef-4c23-ac68-b8bb32997e77","Type":"ContainerDied","Data":"c25c8a92963b364260b28534c9ac2f7cb175f07c14af9b01eafcc5ce938cb09e"} Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.954933 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7n9r9" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.959177 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-blmfl"] Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.975411 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8lpk"] Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.978971 4675 scope.go:117] "RemoveContainer" containerID="165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516" Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.983360 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8lpk"] Mar 20 16:09:12 crc kubenswrapper[4675]: I0320 16:09:12.997190 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r" podStartSLOduration=1.997172001 podStartE2EDuration="1.997172001s" podCreationTimestamp="2026-03-20 16:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:09:12.994658999 +0000 UTC m=+473.028288536" watchObservedRunningTime="2026-03-20 16:09:12.997172001 +0000 UTC m=+473.030801538" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.009297 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gnrqz"] Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.012980 4675 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gnrqz"] Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.023649 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwsgc"] Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.031309 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bwsgc"] Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.041442 4675 scope.go:117] "RemoveContainer" containerID="014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43" Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.041972 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43\": container with ID starting with 014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43 not found: ID does not exist" containerID="014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.042012 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43"} err="failed to get container status \"014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43\": rpc error: code = NotFound desc = could not find container \"014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43\": container with ID starting with 014d9b1c658bdb208c1924fcd3c255d6e63fad0f7e43adf41bf7745b85178f43 not found: ID does not exist" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.042038 4675 scope.go:117] "RemoveContainer" containerID="35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.043347 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-7n9r9"] Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.043831 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748\": container with ID starting with 35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748 not found: ID does not exist" containerID="35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.043870 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748"} err="failed to get container status \"35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748\": rpc error: code = NotFound desc = could not find container \"35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748\": container with ID starting with 35f32a8a37a9876395ae42182463bcfc7d4fdb2f1e23ab9d0ac4db073a9a1748 not found: ID does not exist" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.043898 4675 scope.go:117] "RemoveContainer" containerID="165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516" Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.044256 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516\": container with ID starting with 165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516 not found: ID does not exist" containerID="165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.044307 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516"} 
err="failed to get container status \"165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516\": rpc error: code = NotFound desc = could not find container \"165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516\": container with ID starting with 165bc37589e3296a96f102a8d689c4c951c651c09e5372469386f87d1eb7e516 not found: ID does not exist" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.044337 4675 scope.go:117] "RemoveContainer" containerID="35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.046910 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7n9r9"] Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.060029 4675 scope.go:117] "RemoveContainer" containerID="6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.077238 4675 scope.go:117] "RemoveContainer" containerID="0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.099002 4675 scope.go:117] "RemoveContainer" containerID="35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa" Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.100218 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa\": container with ID starting with 35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa not found: ID does not exist" containerID="35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.100292 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa"} err="failed to get container status 
\"35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa\": rpc error: code = NotFound desc = could not find container \"35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa\": container with ID starting with 35dbe9e0eef467e2585d253b5db2c77cf8fa24ece76f4a015e69c8c3f9d160aa not found: ID does not exist" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.100560 4675 scope.go:117] "RemoveContainer" containerID="6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2" Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.101967 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2\": container with ID starting with 6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2 not found: ID does not exist" containerID="6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.102014 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2"} err="failed to get container status \"6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2\": rpc error: code = NotFound desc = could not find container \"6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2\": container with ID starting with 6fbf0ee782419d9d4210753c11a5ad9fc52bb6842f7cf291b419aa61ae249dd2 not found: ID does not exist" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.102046 4675 scope.go:117] "RemoveContainer" containerID="0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb" Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.102504 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb\": container with ID starting with 0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb not found: ID does not exist" containerID="0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.102547 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb"} err="failed to get container status \"0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb\": rpc error: code = NotFound desc = could not find container \"0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb\": container with ID starting with 0957fa0d74e20d10ffd204f887ea104864dbf73fcde84e82fdb00158fda915bb not found: ID does not exist" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.102573 4675 scope.go:117] "RemoveContainer" containerID="b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.123494 4675 scope.go:117] "RemoveContainer" containerID="767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.138200 4675 scope.go:117] "RemoveContainer" containerID="ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc" Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.160160 4675 scope.go:117] "RemoveContainer" containerID="b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a" Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.160760 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a\": container with ID starting with b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a not found: ID does not exist" 
containerID="b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.160827 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a"} err="failed to get container status \"b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a\": rpc error: code = NotFound desc = could not find container \"b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a\": container with ID starting with b1d1c9cd5993077c1c9f31dffa83d6bbe24d7854902a8daa0823d5df66e9c73a not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.160877 4675 scope.go:117] "RemoveContainer" containerID="767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6"
Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.161327 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6\": container with ID starting with 767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6 not found: ID does not exist" containerID="767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.161369 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6"} err="failed to get container status \"767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6\": rpc error: code = NotFound desc = could not find container \"767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6\": container with ID starting with 767d31e1028f489f6fdbc7b1a8c766e4b929aec3efa1b2cf1adb61f0bc804fa6 not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.161398 4675 scope.go:117] "RemoveContainer" containerID="ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc"
Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.161654 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc\": container with ID starting with ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc not found: ID does not exist" containerID="ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.161698 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc"} err="failed to get container status \"ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc\": rpc error: code = NotFound desc = could not find container \"ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc\": container with ID starting with ca5b56137f4e97b95e638a7393d890872e230dc6ba853000f04ca4f7bb1ba3bc not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.161724 4675 scope.go:117] "RemoveContainer" containerID="34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.173227 4675 scope.go:117] "RemoveContainer" containerID="34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55"
Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.173921 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55\": container with ID starting with 34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55 not found: ID does not exist" containerID="34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.173976 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55"} err="failed to get container status \"34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55\": rpc error: code = NotFound desc = could not find container \"34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55\": container with ID starting with 34904c26aa99b1045b6f0cc67abbe891b673eb70629328755eb2146e11c3dc55 not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.174013 4675 scope.go:117] "RemoveContainer" containerID="ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.187465 4675 scope.go:117] "RemoveContainer" containerID="16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.200822 4675 scope.go:117] "RemoveContainer" containerID="634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.212548 4675 scope.go:117] "RemoveContainer" containerID="ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12"
Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.212863 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12\": container with ID starting with ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12 not found: ID does not exist" containerID="ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.212889 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12"} err="failed to get container status \"ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12\": rpc error: code = NotFound desc = could not find container \"ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12\": container with ID starting with ba85f7e0ffb68f73aec69622dce14ddd1498d145435eb90f9f89bd475d1eec12 not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.212909 4675 scope.go:117] "RemoveContainer" containerID="16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273"
Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.213346 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273\": container with ID starting with 16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273 not found: ID does not exist" containerID="16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.213396 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273"} err="failed to get container status \"16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273\": rpc error: code = NotFound desc = could not find container \"16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273\": container with ID starting with 16ed3e22879743f5170e4f40c44db6499fcf9db49b97d092ff5fd0a3845d5273 not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.213431 4675 scope.go:117] "RemoveContainer" containerID="634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950"
Mar 20 16:09:13 crc kubenswrapper[4675]: E0320 16:09:13.213715 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950\": container with ID starting with 634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950 not found: ID does not exist" containerID="634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.213735 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950"} err="failed to get container status \"634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950\": rpc error: code = NotFound desc = could not find container \"634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950\": container with ID starting with 634a0c1251d3e35fff2bdaee20a93cd17a90907963595617f65a761dbf2b7950 not found: ID does not exist"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.974247 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:13 crc kubenswrapper[4675]: I0320 16:09:13.978374 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xhj8r"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055316 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ml97"]
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055592 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055619 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055641 4675 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055653 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055664 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055676 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055694 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055705 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055717 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055727 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055746 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055758 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055800 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055812 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055830 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055841 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055855 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055865 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055879 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055889 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055905 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055916 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055930 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055941 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="extract-utilities"
Mar 20 16:09:14 crc kubenswrapper[4675]: E0320 16:09:14.055954 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.055964 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="extract-content"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.056108 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" containerName="marketplace-operator"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.056126 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac02846f-d933-4f76-9085-19a28023c633" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.056193 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.056210 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.056226 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" containerName="registry-server"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.057401 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.059613 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.075327 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7348ed2-98c6-4d2f-b738-98961986accc-catalog-content\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.075629 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjxx\" (UniqueName: \"kubernetes.io/projected/a7348ed2-98c6-4d2f-b738-98961986accc-kube-api-access-qhjxx\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.075745 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7348ed2-98c6-4d2f-b738-98961986accc-utilities\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.081724 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ml97"]
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.177367 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7348ed2-98c6-4d2f-b738-98961986accc-catalog-content\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.177469 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjxx\" (UniqueName: \"kubernetes.io/projected/a7348ed2-98c6-4d2f-b738-98961986accc-kube-api-access-qhjxx\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.177495 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7348ed2-98c6-4d2f-b738-98961986accc-utilities\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.178113 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7348ed2-98c6-4d2f-b738-98961986accc-catalog-content\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.178138 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7348ed2-98c6-4d2f-b738-98961986accc-utilities\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") " pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.200612 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjxx\" (UniqueName: \"kubernetes.io/projected/a7348ed2-98c6-4d2f-b738-98961986accc-kube-api-access-qhjxx\") pod \"redhat-marketplace-4ml97\" (UID: \"a7348ed2-98c6-4d2f-b738-98961986accc\") "
pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.256127 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pgdms"]
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.257628 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.263405 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.268299 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgdms"]
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.283427 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86l75\" (UniqueName: \"kubernetes.io/projected/ae86b573-cc65-4b83-b812-9b74eaefbe62-kube-api-access-86l75\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.283509 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae86b573-cc65-4b83-b812-9b74eaefbe62-catalog-content\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.283586 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae86b573-cc65-4b83-b812-9b74eaefbe62-utilities\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.376380 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ml97"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.384992 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86l75\" (UniqueName: \"kubernetes.io/projected/ae86b573-cc65-4b83-b812-9b74eaefbe62-kube-api-access-86l75\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.385028 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae86b573-cc65-4b83-b812-9b74eaefbe62-catalog-content\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.385096 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae86b573-cc65-4b83-b812-9b74eaefbe62-utilities\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.385565 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae86b573-cc65-4b83-b812-9b74eaefbe62-utilities\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.385702 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae86b573-cc65-4b83-b812-9b74eaefbe62-catalog-content\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.411162 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86l75\" (UniqueName: \"kubernetes.io/projected/ae86b573-cc65-4b83-b812-9b74eaefbe62-kube-api-access-86l75\") pod \"redhat-operators-pgdms\" (UID: \"ae86b573-cc65-4b83-b812-9b74eaefbe62\") " pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.584149 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgdms"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.684533 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239989a6-b2d8-4061-88d4-a8a6a656fe6b" path="/var/lib/kubelet/pods/239989a6-b2d8-4061-88d4-a8a6a656fe6b/volumes"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.685617 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d82201b-fc6f-4776-87f3-7cf89822bda5" path="/var/lib/kubelet/pods/4d82201b-fc6f-4776-87f3-7cf89822bda5/volumes"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.686409 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c663f2-11ef-4c23-ac68-b8bb32997e77" path="/var/lib/kubelet/pods/79c663f2-11ef-4c23-ac68-b8bb32997e77/volumes"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.687842 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac02846f-d933-4f76-9085-19a28023c633" path="/var/lib/kubelet/pods/ac02846f-d933-4f76-9085-19a28023c633/volumes"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.688629 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c311c63c-0f7e-4435-a2e3-fbc85a59594e" path="/var/lib/kubelet/pods/c311c63c-0f7e-4435-a2e3-fbc85a59594e/volumes"
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.822650 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ml97"]
Mar 20 16:09:14 crc kubenswrapper[4675]: W0320 16:09:14.829332 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7348ed2_98c6_4d2f_b738_98961986accc.slice/crio-eeffeb09f47e180fb244b35e58e305ed89026e9eede06423cb032bdc21d76231 WatchSource:0}: Error finding container eeffeb09f47e180fb244b35e58e305ed89026e9eede06423cb032bdc21d76231: Status 404 returned error can't find the container with id eeffeb09f47e180fb244b35e58e305ed89026e9eede06423cb032bdc21d76231
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.981218 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ml97" event={"ID":"a7348ed2-98c6-4d2f-b738-98961986accc","Type":"ContainerStarted","Data":"1ad8f6172823ce75f48255be14d3df338aecd1b78230508cefb4bd8e95fa4dc2"}
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.982845 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ml97" event={"ID":"a7348ed2-98c6-4d2f-b738-98961986accc","Type":"ContainerStarted","Data":"eeffeb09f47e180fb244b35e58e305ed89026e9eede06423cb032bdc21d76231"}
Mar 20 16:09:14 crc kubenswrapper[4675]: I0320 16:09:14.998800 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgdms"]
Mar 20 16:09:15 crc kubenswrapper[4675]: I0320 16:09:15.989903 4675 generic.go:334] "Generic (PLEG): container finished" podID="a7348ed2-98c6-4d2f-b738-98961986accc" containerID="1ad8f6172823ce75f48255be14d3df338aecd1b78230508cefb4bd8e95fa4dc2" exitCode=0
Mar 20 16:09:15 crc kubenswrapper[4675]: I0320 16:09:15.989969 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ml97" event={"ID":"a7348ed2-98c6-4d2f-b738-98961986accc","Type":"ContainerDied","Data":"1ad8f6172823ce75f48255be14d3df338aecd1b78230508cefb4bd8e95fa4dc2"}
Mar 20 16:09:15 crc kubenswrapper[4675]: I0320 16:09:15.993617 4675 generic.go:334] "Generic (PLEG): container finished" podID="ae86b573-cc65-4b83-b812-9b74eaefbe62" containerID="017f0cfbe565030f2ade3dea59ac6d477b431544fc38837d63e384ab701da5ef" exitCode=0
Mar 20 16:09:15 crc kubenswrapper[4675]: I0320 16:09:15.994097 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgdms" event={"ID":"ae86b573-cc65-4b83-b812-9b74eaefbe62","Type":"ContainerDied","Data":"017f0cfbe565030f2ade3dea59ac6d477b431544fc38837d63e384ab701da5ef"}
Mar 20 16:09:15 crc kubenswrapper[4675]: I0320 16:09:15.994147 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgdms" event={"ID":"ae86b573-cc65-4b83-b812-9b74eaefbe62","Type":"ContainerStarted","Data":"3aec40bfa54cd8f70a2985dd81e871b7fa0ed8e79950d22c675d9cc1c6137abd"}
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.471042 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6fszb"]
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.473606 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.476758 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.481177 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fszb"]
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.514657 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2420ca6e-21bb-4804-b714-b0ac748a5d4a-catalog-content\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.514718 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prm2\" (UniqueName: \"kubernetes.io/projected/2420ca6e-21bb-4804-b714-b0ac748a5d4a-kube-api-access-6prm2\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.514896 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2420ca6e-21bb-4804-b714-b0ac748a5d4a-utilities\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.616171 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prm2\" (UniqueName: \"kubernetes.io/projected/2420ca6e-21bb-4804-b714-b0ac748a5d4a-kube-api-access-6prm2\") pod \"certified-operators-6fszb\"
(UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.616328 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2420ca6e-21bb-4804-b714-b0ac748a5d4a-utilities\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.616418 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2420ca6e-21bb-4804-b714-b0ac748a5d4a-catalog-content\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.616938 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2420ca6e-21bb-4804-b714-b0ac748a5d4a-catalog-content\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.616941 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2420ca6e-21bb-4804-b714-b0ac748a5d4a-utilities\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.646815 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prm2\" (UniqueName: \"kubernetes.io/projected/2420ca6e-21bb-4804-b714-b0ac748a5d4a-kube-api-access-6prm2\") pod \"certified-operators-6fszb\" (UID: \"2420ca6e-21bb-4804-b714-b0ac748a5d4a\") " pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.666928 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvl9k"]
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.667910 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.671158 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.672178 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvl9k"]
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.717250 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d94b5a9e-b23c-497a-befb-9e5df7a05b76-utilities\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.717476 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d94b5a9e-b23c-497a-befb-9e5df7a05b76-catalog-content\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.717681 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htffv\" (UniqueName: \"kubernetes.io/projected/d94b5a9e-b23c-497a-befb-9e5df7a05b76-kube-api-access-htffv\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.801911 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fszb"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.819379 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htffv\" (UniqueName: \"kubernetes.io/projected/d94b5a9e-b23c-497a-befb-9e5df7a05b76-kube-api-access-htffv\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.819534 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d94b5a9e-b23c-497a-befb-9e5df7a05b76-utilities\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.819828 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d94b5a9e-b23c-497a-befb-9e5df7a05b76-catalog-content\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.819938 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d94b5a9e-b23c-497a-befb-9e5df7a05b76-utilities\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.820231 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d94b5a9e-b23c-497a-befb-9e5df7a05b76-catalog-content\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.845492 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htffv\" (UniqueName: \"kubernetes.io/projected/d94b5a9e-b23c-497a-befb-9e5df7a05b76-kube-api-access-htffv\") pod \"community-operators-rvl9k\" (UID: \"d94b5a9e-b23c-497a-befb-9e5df7a05b76\") " pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:16 crc kubenswrapper[4675]: I0320 16:09:16.985470 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvl9k"
Mar 20 16:09:17 crc kubenswrapper[4675]: I0320 16:09:17.001520 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgdms" event={"ID":"ae86b573-cc65-4b83-b812-9b74eaefbe62","Type":"ContainerStarted","Data":"15ce8d790c13962c4fd681138abbe689806ff70a37ad075def6fe9b2e4fd24e5"}
Mar 20 16:09:17 crc kubenswrapper[4675]: I0320 16:09:17.208979 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fszb"]
Mar 20 16:09:17 crc kubenswrapper[4675]: W0320 16:09:17.212682 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2420ca6e_21bb_4804_b714_b0ac748a5d4a.slice/crio-9189f6d8ba4e9f4fe4dd1b9eeb78d8b4efb5df0f4ba85690f465a4a6a5757f8b WatchSource:0}: Error finding container 9189f6d8ba4e9f4fe4dd1b9eeb78d8b4efb5df0f4ba85690f465a4a6a5757f8b: Status 404 returned error can't find the container with id 9189f6d8ba4e9f4fe4dd1b9eeb78d8b4efb5df0f4ba85690f465a4a6a5757f8b
Mar 20 16:09:17 crc kubenswrapper[4675]: I0320 16:09:17.402288 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvl9k"]
Mar 20 16:09:17 crc kubenswrapper[4675]: W0320 16:09:17.453244 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd94b5a9e_b23c_497a_befb_9e5df7a05b76.slice/crio-2fd7e88ea1e3a47b5c17be726999597c113753b7710e18c372df85c223405c82 WatchSource:0}: Error finding container 2fd7e88ea1e3a47b5c17be726999597c113753b7710e18c372df85c223405c82: Status 404 returned error can't find the container with id 2fd7e88ea1e3a47b5c17be726999597c113753b7710e18c372df85c223405c82
Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.012913 4675 generic.go:334] "Generic (PLEG): container finished" podID="ae86b573-cc65-4b83-b812-9b74eaefbe62" containerID="15ce8d790c13962c4fd681138abbe689806ff70a37ad075def6fe9b2e4fd24e5" exitCode=0
Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.012981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgdms" event={"ID":"ae86b573-cc65-4b83-b812-9b74eaefbe62","Type":"ContainerDied","Data":"15ce8d790c13962c4fd681138abbe689806ff70a37ad075def6fe9b2e4fd24e5"}
Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.018442 4675 generic.go:334] "Generic (PLEG): container finished" podID="a7348ed2-98c6-4d2f-b738-98961986accc" containerID="cc09b5adaeab1e068ae9cac9e04c7150e09a963c117602ffa242a5b6ee5427f6" exitCode=0
Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.020703 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ml97" event={"ID":"a7348ed2-98c6-4d2f-b738-98961986accc","Type":"ContainerDied","Data":"cc09b5adaeab1e068ae9cac9e04c7150e09a963c117602ffa242a5b6ee5427f6"}
Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.028287 4675 generic.go:334] "Generic (PLEG): container finished" podID="2420ca6e-21bb-4804-b714-b0ac748a5d4a" containerID="e3f82635d66818cd1050b8259f7fe77673a169f02ecc6e8a705b97c7e27a673e"
exitCode=0 Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.028341 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fszb" event={"ID":"2420ca6e-21bb-4804-b714-b0ac748a5d4a","Type":"ContainerDied","Data":"e3f82635d66818cd1050b8259f7fe77673a169f02ecc6e8a705b97c7e27a673e"} Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.028361 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fszb" event={"ID":"2420ca6e-21bb-4804-b714-b0ac748a5d4a","Type":"ContainerStarted","Data":"9189f6d8ba4e9f4fe4dd1b9eeb78d8b4efb5df0f4ba85690f465a4a6a5757f8b"} Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.034494 4675 generic.go:334] "Generic (PLEG): container finished" podID="d94b5a9e-b23c-497a-befb-9e5df7a05b76" containerID="f3e326a078aa8aba94ce3e82fb1724975f5b99b6a763cfd4b0496c28e8747179" exitCode=0 Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.034537 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvl9k" event={"ID":"d94b5a9e-b23c-497a-befb-9e5df7a05b76","Type":"ContainerDied","Data":"f3e326a078aa8aba94ce3e82fb1724975f5b99b6a763cfd4b0496c28e8747179"} Mar 20 16:09:18 crc kubenswrapper[4675]: I0320 16:09:18.034571 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvl9k" event={"ID":"d94b5a9e-b23c-497a-befb-9e5df7a05b76","Type":"ContainerStarted","Data":"2fd7e88ea1e3a47b5c17be726999597c113753b7710e18c372df85c223405c82"} Mar 20 16:09:19 crc kubenswrapper[4675]: I0320 16:09:19.040691 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgdms" event={"ID":"ae86b573-cc65-4b83-b812-9b74eaefbe62","Type":"ContainerStarted","Data":"97bf6f579b658590942f8db2e385e5f4fe35654611b5f57b0a1093c4af166d09"} Mar 20 16:09:19 crc kubenswrapper[4675]: I0320 16:09:19.043290 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4ml97" event={"ID":"a7348ed2-98c6-4d2f-b738-98961986accc","Type":"ContainerStarted","Data":"2709769bad2a69d1ace49163c6cd2cf294048f648e6fb661706bd7cd60f0f91c"} Mar 20 16:09:19 crc kubenswrapper[4675]: I0320 16:09:19.062034 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pgdms" podStartSLOduration=2.528633064 podStartE2EDuration="5.061993944s" podCreationTimestamp="2026-03-20 16:09:14 +0000 UTC" firstStartedPulling="2026-03-20 16:09:15.995038247 +0000 UTC m=+476.028667784" lastFinishedPulling="2026-03-20 16:09:18.528399067 +0000 UTC m=+478.562028664" observedRunningTime="2026-03-20 16:09:19.060018367 +0000 UTC m=+479.093647904" watchObservedRunningTime="2026-03-20 16:09:19.061993944 +0000 UTC m=+479.095623481" Mar 20 16:09:19 crc kubenswrapper[4675]: I0320 16:09:19.076573 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ml97" podStartSLOduration=2.387921326 podStartE2EDuration="5.076560414s" podCreationTimestamp="2026-03-20 16:09:14 +0000 UTC" firstStartedPulling="2026-03-20 16:09:15.992551655 +0000 UTC m=+476.026181192" lastFinishedPulling="2026-03-20 16:09:18.681190733 +0000 UTC m=+478.714820280" observedRunningTime="2026-03-20 16:09:19.075547105 +0000 UTC m=+479.109176632" watchObservedRunningTime="2026-03-20 16:09:19.076560414 +0000 UTC m=+479.110189951" Mar 20 16:09:20 crc kubenswrapper[4675]: I0320 16:09:20.052686 4675 generic.go:334] "Generic (PLEG): container finished" podID="2420ca6e-21bb-4804-b714-b0ac748a5d4a" containerID="e718057f45ba6767d29d7d6d29a01bc04dfdc1aef92e4fd958bc2d40f70b19ee" exitCode=0 Mar 20 16:09:20 crc kubenswrapper[4675]: I0320 16:09:20.052744 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fszb" 
event={"ID":"2420ca6e-21bb-4804-b714-b0ac748a5d4a","Type":"ContainerDied","Data":"e718057f45ba6767d29d7d6d29a01bc04dfdc1aef92e4fd958bc2d40f70b19ee"} Mar 20 16:09:20 crc kubenswrapper[4675]: I0320 16:09:20.054544 4675 generic.go:334] "Generic (PLEG): container finished" podID="d94b5a9e-b23c-497a-befb-9e5df7a05b76" containerID="e9d74832cdd43fce27a3be4220e0bca25266ddfd5a2116bc3f9056ad84dd0ef9" exitCode=0 Mar 20 16:09:20 crc kubenswrapper[4675]: I0320 16:09:20.054862 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvl9k" event={"ID":"d94b5a9e-b23c-497a-befb-9e5df7a05b76","Type":"ContainerDied","Data":"e9d74832cdd43fce27a3be4220e0bca25266ddfd5a2116bc3f9056ad84dd0ef9"} Mar 20 16:09:21 crc kubenswrapper[4675]: I0320 16:09:21.064064 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fszb" event={"ID":"2420ca6e-21bb-4804-b714-b0ac748a5d4a","Type":"ContainerStarted","Data":"6e9daf1f25e3a18cd428aa5afa339d28d5239fc4aba000c9bf5c67a920efb655"} Mar 20 16:09:21 crc kubenswrapper[4675]: I0320 16:09:21.071361 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvl9k" event={"ID":"d94b5a9e-b23c-497a-befb-9e5df7a05b76","Type":"ContainerStarted","Data":"9b295ccad6ea00950db82902a5dd4effd080c23bd774104c6b623bf2ffac6eff"} Mar 20 16:09:21 crc kubenswrapper[4675]: I0320 16:09:21.079441 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6fszb" podStartSLOduration=2.521362835 podStartE2EDuration="5.079424027s" podCreationTimestamp="2026-03-20 16:09:16 +0000 UTC" firstStartedPulling="2026-03-20 16:09:18.035590707 +0000 UTC m=+478.069220264" lastFinishedPulling="2026-03-20 16:09:20.593651909 +0000 UTC m=+480.627281456" observedRunningTime="2026-03-20 16:09:21.078659595 +0000 UTC m=+481.112289142" watchObservedRunningTime="2026-03-20 16:09:21.079424027 +0000 UTC 
m=+481.113053574" Mar 20 16:09:21 crc kubenswrapper[4675]: I0320 16:09:21.099308 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvl9k" podStartSLOduration=2.365827249 podStartE2EDuration="5.09929254s" podCreationTimestamp="2026-03-20 16:09:16 +0000 UTC" firstStartedPulling="2026-03-20 16:09:18.036991097 +0000 UTC m=+478.070620644" lastFinishedPulling="2026-03-20 16:09:20.770456388 +0000 UTC m=+480.804085935" observedRunningTime="2026-03-20 16:09:21.095965734 +0000 UTC m=+481.129595291" watchObservedRunningTime="2026-03-20 16:09:21.09929254 +0000 UTC m=+481.132922077" Mar 20 16:09:24 crc kubenswrapper[4675]: I0320 16:09:24.377484 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ml97" Mar 20 16:09:24 crc kubenswrapper[4675]: I0320 16:09:24.377854 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4ml97" Mar 20 16:09:24 crc kubenswrapper[4675]: I0320 16:09:24.458382 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ml97" Mar 20 16:09:24 crc kubenswrapper[4675]: I0320 16:09:24.584949 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pgdms" Mar 20 16:09:24 crc kubenswrapper[4675]: I0320 16:09:24.585023 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pgdms" Mar 20 16:09:25 crc kubenswrapper[4675]: I0320 16:09:25.144377 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ml97" Mar 20 16:09:25 crc kubenswrapper[4675]: I0320 16:09:25.633382 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgdms" podUID="ae86b573-cc65-4b83-b812-9b74eaefbe62" 
containerName="registry-server" probeResult="failure" output=< Mar 20 16:09:25 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s Mar 20 16:09:25 crc kubenswrapper[4675]: > Mar 20 16:09:26 crc kubenswrapper[4675]: I0320 16:09:26.802850 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6fszb" Mar 20 16:09:26 crc kubenswrapper[4675]: I0320 16:09:26.802943 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6fszb" Mar 20 16:09:26 crc kubenswrapper[4675]: I0320 16:09:26.879196 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6fszb" Mar 20 16:09:26 crc kubenswrapper[4675]: I0320 16:09:26.986176 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvl9k" Mar 20 16:09:26 crc kubenswrapper[4675]: I0320 16:09:26.986264 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvl9k" Mar 20 16:09:27 crc kubenswrapper[4675]: I0320 16:09:27.051125 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvl9k" Mar 20 16:09:27 crc kubenswrapper[4675]: I0320 16:09:27.151676 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvl9k" Mar 20 16:09:27 crc kubenswrapper[4675]: I0320 16:09:27.170809 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6fszb" Mar 20 16:09:34 crc kubenswrapper[4675]: I0320 16:09:34.654248 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pgdms" Mar 20 16:09:34 crc kubenswrapper[4675]: I0320 16:09:34.723903 4675 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pgdms" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.143243 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567050-5pkzv"] Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.144555 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.146741 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.146928 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.147075 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.163393 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-5pkzv"] Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.254873 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrrb\" (UniqueName: \"kubernetes.io/projected/c5a5c91f-fec6-410d-a21f-7cf516d89f78-kube-api-access-dgrrb\") pod \"auto-csr-approver-29567050-5pkzv\" (UID: \"c5a5c91f-fec6-410d-a21f-7cf516d89f78\") " pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.355944 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrrb\" (UniqueName: \"kubernetes.io/projected/c5a5c91f-fec6-410d-a21f-7cf516d89f78-kube-api-access-dgrrb\") pod \"auto-csr-approver-29567050-5pkzv\" (UID: \"c5a5c91f-fec6-410d-a21f-7cf516d89f78\") " 
pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.376045 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrrb\" (UniqueName: \"kubernetes.io/projected/c5a5c91f-fec6-410d-a21f-7cf516d89f78-kube-api-access-dgrrb\") pod \"auto-csr-approver-29567050-5pkzv\" (UID: \"c5a5c91f-fec6-410d-a21f-7cf516d89f78\") " pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.460032 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.687616 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.689903 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-5pkzv"] Mar 20 16:10:00 crc kubenswrapper[4675]: I0320 16:10:00.967797 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" event={"ID":"c5a5c91f-fec6-410d-a21f-7cf516d89f78","Type":"ContainerStarted","Data":"716e0b55d1d88da633029c358857a9614bf772feb62928c73c95bb5c2f11de8c"} Mar 20 16:10:02 crc kubenswrapper[4675]: I0320 16:10:02.993703 4675 generic.go:334] "Generic (PLEG): container finished" podID="c5a5c91f-fec6-410d-a21f-7cf516d89f78" containerID="cbdae13faf90ce897531798cd7426f8fae027a7002aea03bc349720cc4233a99" exitCode=0 Mar 20 16:10:02 crc kubenswrapper[4675]: I0320 16:10:02.993757 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" event={"ID":"c5a5c91f-fec6-410d-a21f-7cf516d89f78","Type":"ContainerDied","Data":"cbdae13faf90ce897531798cd7426f8fae027a7002aea03bc349720cc4233a99"} Mar 20 16:10:04 crc kubenswrapper[4675]: I0320 16:10:04.294195 4675 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:04 crc kubenswrapper[4675]: I0320 16:10:04.410200 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgrrb\" (UniqueName: \"kubernetes.io/projected/c5a5c91f-fec6-410d-a21f-7cf516d89f78-kube-api-access-dgrrb\") pod \"c5a5c91f-fec6-410d-a21f-7cf516d89f78\" (UID: \"c5a5c91f-fec6-410d-a21f-7cf516d89f78\") " Mar 20 16:10:04 crc kubenswrapper[4675]: I0320 16:10:04.421247 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a5c91f-fec6-410d-a21f-7cf516d89f78-kube-api-access-dgrrb" (OuterVolumeSpecName: "kube-api-access-dgrrb") pod "c5a5c91f-fec6-410d-a21f-7cf516d89f78" (UID: "c5a5c91f-fec6-410d-a21f-7cf516d89f78"). InnerVolumeSpecName "kube-api-access-dgrrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:10:04 crc kubenswrapper[4675]: I0320 16:10:04.511805 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgrrb\" (UniqueName: \"kubernetes.io/projected/c5a5c91f-fec6-410d-a21f-7cf516d89f78-kube-api-access-dgrrb\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:05 crc kubenswrapper[4675]: I0320 16:10:05.011951 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" event={"ID":"c5a5c91f-fec6-410d-a21f-7cf516d89f78","Type":"ContainerDied","Data":"716e0b55d1d88da633029c358857a9614bf772feb62928c73c95bb5c2f11de8c"} Mar 20 16:10:05 crc kubenswrapper[4675]: I0320 16:10:05.012025 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="716e0b55d1d88da633029c358857a9614bf772feb62928c73c95bb5c2f11de8c" Mar 20 16:10:05 crc kubenswrapper[4675]: I0320 16:10:05.012050 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-5pkzv" Mar 20 16:10:05 crc kubenswrapper[4675]: I0320 16:10:05.372534 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-wld22"] Mar 20 16:10:05 crc kubenswrapper[4675]: I0320 16:10:05.380998 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-wld22"] Mar 20 16:10:06 crc kubenswrapper[4675]: I0320 16:10:06.684738 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d2332a-bd88-45d7-8645-63778001dd65" path="/var/lib/kubelet/pods/c6d2332a-bd88-45d7-8645-63778001dd65/volumes" Mar 20 16:11:04 crc kubenswrapper[4675]: I0320 16:11:04.424906 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:11:04 crc kubenswrapper[4675]: I0320 16:11:04.425411 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:11:34 crc kubenswrapper[4675]: I0320 16:11:34.425212 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:11:34 crc kubenswrapper[4675]: I0320 16:11:34.425836 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" 
podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.152674 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567052-d9ghs"] Mar 20 16:12:00 crc kubenswrapper[4675]: E0320 16:12:00.154175 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a5c91f-fec6-410d-a21f-7cf516d89f78" containerName="oc" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.154208 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a5c91f-fec6-410d-a21f-7cf516d89f78" containerName="oc" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.154475 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a5c91f-fec6-410d-a21f-7cf516d89f78" containerName="oc" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.155194 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.159834 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.160719 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.164212 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.171682 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-d9ghs"] Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.292461 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ztq\" (UniqueName: \"kubernetes.io/projected/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8-kube-api-access-q6ztq\") pod \"auto-csr-approver-29567052-d9ghs\" (UID: \"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8\") " pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.393827 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ztq\" (UniqueName: \"kubernetes.io/projected/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8-kube-api-access-q6ztq\") pod \"auto-csr-approver-29567052-d9ghs\" (UID: \"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8\") " pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.415745 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ztq\" (UniqueName: \"kubernetes.io/projected/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8-kube-api-access-q6ztq\") pod \"auto-csr-approver-29567052-d9ghs\" (UID: \"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8\") " 
pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.485932 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.661884 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-d9ghs"] Mar 20 16:12:00 crc kubenswrapper[4675]: I0320 16:12:00.790047 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" event={"ID":"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8","Type":"ContainerStarted","Data":"f67347c1e5462adc4998116f9c568e04a1ef93a27e0239ec79a546ee184ace1b"} Mar 20 16:12:03 crc kubenswrapper[4675]: I0320 16:12:03.819008 4675 generic.go:334] "Generic (PLEG): container finished" podID="3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8" containerID="e5fbb67eea72da425118a53beb69fd39d85cd3cf4d1d9dbfb46c92cb55d56770" exitCode=0 Mar 20 16:12:03 crc kubenswrapper[4675]: I0320 16:12:03.819120 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" event={"ID":"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8","Type":"ContainerDied","Data":"e5fbb67eea72da425118a53beb69fd39d85cd3cf4d1d9dbfb46c92cb55d56770"} Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.424237 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.424290 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.424330 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.424906 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c758d44584e0697febf8722629cecc7107d8ebc056ab2b31afed6d9fd7d43fbf"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.424966 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://c758d44584e0697febf8722629cecc7107d8ebc056ab2b31afed6d9fd7d43fbf" gracePeriod=600 Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.830412 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="c758d44584e0697febf8722629cecc7107d8ebc056ab2b31afed6d9fd7d43fbf" exitCode=0 Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.830457 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"c758d44584e0697febf8722629cecc7107d8ebc056ab2b31afed6d9fd7d43fbf"} Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.830846 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" 
event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"3892cdfbf0fe325e0457d2710b051a0a4b16edb3528b76523a2a7e168e5d772b"} Mar 20 16:12:04 crc kubenswrapper[4675]: I0320 16:12:04.830874 4675 scope.go:117] "RemoveContainer" containerID="67b506092b500b63a2b5d18168ce40f7a503401a387fe638308c8230ccbef555" Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.040115 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.158735 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6ztq\" (UniqueName: \"kubernetes.io/projected/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8-kube-api-access-q6ztq\") pod \"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8\" (UID: \"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8\") " Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.163331 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8-kube-api-access-q6ztq" (OuterVolumeSpecName: "kube-api-access-q6ztq") pod "3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8" (UID: "3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8"). InnerVolumeSpecName "kube-api-access-q6ztq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.260521 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6ztq\" (UniqueName: \"kubernetes.io/projected/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8-kube-api-access-q6ztq\") on node \"crc\" DevicePath \"\"" Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.845384 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" event={"ID":"3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8","Type":"ContainerDied","Data":"f67347c1e5462adc4998116f9c568e04a1ef93a27e0239ec79a546ee184ace1b"} Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.845431 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f67347c1e5462adc4998116f9c568e04a1ef93a27e0239ec79a546ee184ace1b" Mar 20 16:12:05 crc kubenswrapper[4675]: I0320 16:12:05.845480 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-d9ghs" Mar 20 16:12:06 crc kubenswrapper[4675]: I0320 16:12:06.112043 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-mvnw2"] Mar 20 16:12:06 crc kubenswrapper[4675]: I0320 16:12:06.118810 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-mvnw2"] Mar 20 16:12:06 crc kubenswrapper[4675]: I0320 16:12:06.684799 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6" path="/var/lib/kubelet/pods/9aa4bf42-3c65-48fb-99c4-0c6d9c2badc6/volumes" Mar 20 16:12:35 crc kubenswrapper[4675]: I0320 16:12:35.929739 4675 scope.go:117] "RemoveContainer" containerID="9364185492eac3fab8536dff3ddb647c52012925cb9e2d0f978b272cc38088fa" Mar 20 16:12:35 crc kubenswrapper[4675]: I0320 16:12:35.964677 4675 scope.go:117] "RemoveContainer" 
containerID="99e087d316e575725ebee1ad8e9129cd24309a60c1b113f3efe910b1e8259618" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.143625 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567054-vr2qs"] Mar 20 16:14:00 crc kubenswrapper[4675]: E0320 16:14:00.144318 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8" containerName="oc" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.144333 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8" containerName="oc" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.144469 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8" containerName="oc" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.144852 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.147306 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.149447 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.149731 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.152681 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-vr2qs"] Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.210042 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdms\" (UniqueName: \"kubernetes.io/projected/bb90f724-cdb5-440c-885c-2933b81bd258-kube-api-access-zwdms\") 
pod \"auto-csr-approver-29567054-vr2qs\" (UID: \"bb90f724-cdb5-440c-885c-2933b81bd258\") " pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.310714 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwdms\" (UniqueName: \"kubernetes.io/projected/bb90f724-cdb5-440c-885c-2933b81bd258-kube-api-access-zwdms\") pod \"auto-csr-approver-29567054-vr2qs\" (UID: \"bb90f724-cdb5-440c-885c-2933b81bd258\") " pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.328268 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwdms\" (UniqueName: \"kubernetes.io/projected/bb90f724-cdb5-440c-885c-2933b81bd258-kube-api-access-zwdms\") pod \"auto-csr-approver-29567054-vr2qs\" (UID: \"bb90f724-cdb5-440c-885c-2933b81bd258\") " pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.508746 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:00 crc kubenswrapper[4675]: I0320 16:14:00.947068 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-vr2qs"] Mar 20 16:14:00 crc kubenswrapper[4675]: W0320 16:14:00.959708 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb90f724_cdb5_440c_885c_2933b81bd258.slice/crio-5967b4473e2c7a13b29adbc789896ba6777ebb48f27805439b313617c5c7fb36 WatchSource:0}: Error finding container 5967b4473e2c7a13b29adbc789896ba6777ebb48f27805439b313617c5c7fb36: Status 404 returned error can't find the container with id 5967b4473e2c7a13b29adbc789896ba6777ebb48f27805439b313617c5c7fb36 Mar 20 16:14:01 crc kubenswrapper[4675]: I0320 16:14:01.664741 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" event={"ID":"bb90f724-cdb5-440c-885c-2933b81bd258","Type":"ContainerStarted","Data":"5967b4473e2c7a13b29adbc789896ba6777ebb48f27805439b313617c5c7fb36"} Mar 20 16:14:03 crc kubenswrapper[4675]: I0320 16:14:03.679438 4675 generic.go:334] "Generic (PLEG): container finished" podID="bb90f724-cdb5-440c-885c-2933b81bd258" containerID="46da6bc5cd6b0d528789f8622586a747fc5ba1123258d1379dc1ae6321c4a71f" exitCode=0 Mar 20 16:14:03 crc kubenswrapper[4675]: I0320 16:14:03.679556 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" event={"ID":"bb90f724-cdb5-440c-885c-2933b81bd258","Type":"ContainerDied","Data":"46da6bc5cd6b0d528789f8622586a747fc5ba1123258d1379dc1ae6321c4a71f"} Mar 20 16:14:04 crc kubenswrapper[4675]: I0320 16:14:04.424413 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:14:04 crc kubenswrapper[4675]: I0320 16:14:04.424487 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:14:04 crc kubenswrapper[4675]: I0320 16:14:04.889679 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:04 crc kubenswrapper[4675]: I0320 16:14:04.980047 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwdms\" (UniqueName: \"kubernetes.io/projected/bb90f724-cdb5-440c-885c-2933b81bd258-kube-api-access-zwdms\") pod \"bb90f724-cdb5-440c-885c-2933b81bd258\" (UID: \"bb90f724-cdb5-440c-885c-2933b81bd258\") " Mar 20 16:14:04 crc kubenswrapper[4675]: I0320 16:14:04.989103 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb90f724-cdb5-440c-885c-2933b81bd258-kube-api-access-zwdms" (OuterVolumeSpecName: "kube-api-access-zwdms") pod "bb90f724-cdb5-440c-885c-2933b81bd258" (UID: "bb90f724-cdb5-440c-885c-2933b81bd258"). InnerVolumeSpecName "kube-api-access-zwdms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:14:05 crc kubenswrapper[4675]: I0320 16:14:05.081703 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwdms\" (UniqueName: \"kubernetes.io/projected/bb90f724-cdb5-440c-885c-2933b81bd258-kube-api-access-zwdms\") on node \"crc\" DevicePath \"\"" Mar 20 16:14:05 crc kubenswrapper[4675]: I0320 16:14:05.692558 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" event={"ID":"bb90f724-cdb5-440c-885c-2933b81bd258","Type":"ContainerDied","Data":"5967b4473e2c7a13b29adbc789896ba6777ebb48f27805439b313617c5c7fb36"} Mar 20 16:14:05 crc kubenswrapper[4675]: I0320 16:14:05.692617 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5967b4473e2c7a13b29adbc789896ba6777ebb48f27805439b313617c5c7fb36" Mar 20 16:14:05 crc kubenswrapper[4675]: I0320 16:14:05.692674 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-vr2qs" Mar 20 16:14:05 crc kubenswrapper[4675]: I0320 16:14:05.971959 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-2ct4l"] Mar 20 16:14:05 crc kubenswrapper[4675]: I0320 16:14:05.985707 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-2ct4l"] Mar 20 16:14:06 crc kubenswrapper[4675]: I0320 16:14:06.684070 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7176cd-8e2e-4a66-81e4-41b8502d8813" path="/var/lib/kubelet/pods/0e7176cd-8e2e-4a66-81e4-41b8502d8813/volumes" Mar 20 16:14:34 crc kubenswrapper[4675]: I0320 16:14:34.424377 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 16:14:34 crc kubenswrapper[4675]: I0320 16:14:34.425133 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:14:36 crc kubenswrapper[4675]: I0320 16:14:36.032324 4675 scope.go:117] "RemoveContainer" containerID="c11c751a61ba3e0c414d0610c62e3dea1dfe20836050eabe5667db0f7d31844a" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.141780 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj"] Mar 20 16:15:00 crc kubenswrapper[4675]: E0320 16:15:00.142425 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb90f724-cdb5-440c-885c-2933b81bd258" containerName="oc" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.142436 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb90f724-cdb5-440c-885c-2933b81bd258" containerName="oc" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.142534 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb90f724-cdb5-440c-885c-2933b81bd258" containerName="oc" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.142872 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.144930 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.145063 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.153317 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj"] Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.187619 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6fe470-80b6-4726-b5b8-a70ca6901685-config-volume\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.187670 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6fe470-80b6-4726-b5b8-a70ca6901685-secret-volume\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.187705 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsl7l\" (UniqueName: \"kubernetes.io/projected/cc6fe470-80b6-4726-b5b8-a70ca6901685-kube-api-access-lsl7l\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.288532 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6fe470-80b6-4726-b5b8-a70ca6901685-config-volume\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.288872 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6fe470-80b6-4726-b5b8-a70ca6901685-secret-volume\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.288922 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsl7l\" (UniqueName: \"kubernetes.io/projected/cc6fe470-80b6-4726-b5b8-a70ca6901685-kube-api-access-lsl7l\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.289727 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6fe470-80b6-4726-b5b8-a70ca6901685-config-volume\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.299707 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cc6fe470-80b6-4726-b5b8-a70ca6901685-secret-volume\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.308121 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsl7l\" (UniqueName: \"kubernetes.io/projected/cc6fe470-80b6-4726-b5b8-a70ca6901685-kube-api-access-lsl7l\") pod \"collect-profiles-29567055-bqwbj\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.463704 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:00 crc kubenswrapper[4675]: I0320 16:15:00.640864 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj"] Mar 20 16:15:01 crc kubenswrapper[4675]: I0320 16:15:01.081360 4675 generic.go:334] "Generic (PLEG): container finished" podID="cc6fe470-80b6-4726-b5b8-a70ca6901685" containerID="6185954bac2c2eae8d2903b967d05bcf6d2cd541e24378c805d0b4e5fc83d36e" exitCode=0 Mar 20 16:15:01 crc kubenswrapper[4675]: I0320 16:15:01.081458 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" event={"ID":"cc6fe470-80b6-4726-b5b8-a70ca6901685","Type":"ContainerDied","Data":"6185954bac2c2eae8d2903b967d05bcf6d2cd541e24378c805d0b4e5fc83d36e"} Mar 20 16:15:01 crc kubenswrapper[4675]: I0320 16:15:01.081887 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" 
event={"ID":"cc6fe470-80b6-4726-b5b8-a70ca6901685","Type":"ContainerStarted","Data":"0ba2f968ec2f79530fbf06f895545cd7e007598c83b684428bd66ece3295733b"} Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.325410 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.514062 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsl7l\" (UniqueName: \"kubernetes.io/projected/cc6fe470-80b6-4726-b5b8-a70ca6901685-kube-api-access-lsl7l\") pod \"cc6fe470-80b6-4726-b5b8-a70ca6901685\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.514138 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6fe470-80b6-4726-b5b8-a70ca6901685-config-volume\") pod \"cc6fe470-80b6-4726-b5b8-a70ca6901685\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.514199 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6fe470-80b6-4726-b5b8-a70ca6901685-secret-volume\") pod \"cc6fe470-80b6-4726-b5b8-a70ca6901685\" (UID: \"cc6fe470-80b6-4726-b5b8-a70ca6901685\") " Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.514993 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6fe470-80b6-4726-b5b8-a70ca6901685-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc6fe470-80b6-4726-b5b8-a70ca6901685" (UID: "cc6fe470-80b6-4726-b5b8-a70ca6901685"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.519865 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6fe470-80b6-4726-b5b8-a70ca6901685-kube-api-access-lsl7l" (OuterVolumeSpecName: "kube-api-access-lsl7l") pod "cc6fe470-80b6-4726-b5b8-a70ca6901685" (UID: "cc6fe470-80b6-4726-b5b8-a70ca6901685"). InnerVolumeSpecName "kube-api-access-lsl7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.520329 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6fe470-80b6-4726-b5b8-a70ca6901685-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc6fe470-80b6-4726-b5b8-a70ca6901685" (UID: "cc6fe470-80b6-4726-b5b8-a70ca6901685"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.615471 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsl7l\" (UniqueName: \"kubernetes.io/projected/cc6fe470-80b6-4726-b5b8-a70ca6901685-kube-api-access-lsl7l\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.615504 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6fe470-80b6-4726-b5b8-a70ca6901685-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:02 crc kubenswrapper[4675]: I0320 16:15:02.615513 4675 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6fe470-80b6-4726-b5b8-a70ca6901685-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4675]: I0320 16:15:03.096257 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" 
event={"ID":"cc6fe470-80b6-4726-b5b8-a70ca6901685","Type":"ContainerDied","Data":"0ba2f968ec2f79530fbf06f895545cd7e007598c83b684428bd66ece3295733b"} Mar 20 16:15:03 crc kubenswrapper[4675]: I0320 16:15:03.096313 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba2f968ec2f79530fbf06f895545cd7e007598c83b684428bd66ece3295733b" Mar 20 16:15:03 crc kubenswrapper[4675]: I0320 16:15:03.096365 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-bqwbj" Mar 20 16:15:04 crc kubenswrapper[4675]: I0320 16:15:04.424564 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:15:04 crc kubenswrapper[4675]: I0320 16:15:04.424844 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:15:04 crc kubenswrapper[4675]: I0320 16:15:04.424885 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:15:04 crc kubenswrapper[4675]: I0320 16:15:04.425425 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3892cdfbf0fe325e0457d2710b051a0a4b16edb3528b76523a2a7e168e5d772b"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:15:04 crc 
kubenswrapper[4675]: I0320 16:15:04.425474 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://3892cdfbf0fe325e0457d2710b051a0a4b16edb3528b76523a2a7e168e5d772b" gracePeriod=600 Mar 20 16:15:05 crc kubenswrapper[4675]: I0320 16:15:05.108381 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="3892cdfbf0fe325e0457d2710b051a0a4b16edb3528b76523a2a7e168e5d772b" exitCode=0 Mar 20 16:15:05 crc kubenswrapper[4675]: I0320 16:15:05.108456 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"3892cdfbf0fe325e0457d2710b051a0a4b16edb3528b76523a2a7e168e5d772b"} Mar 20 16:15:05 crc kubenswrapper[4675]: I0320 16:15:05.108762 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"8b8ad200c1fd09c2db80eb419a24ccfd9bc395099eb6158f7f572743729d42ad"} Mar 20 16:15:05 crc kubenswrapper[4675]: I0320 16:15:05.108812 4675 scope.go:117] "RemoveContainer" containerID="c758d44584e0697febf8722629cecc7107d8ebc056ab2b31afed6d9fd7d43fbf" Mar 20 16:15:07 crc kubenswrapper[4675]: I0320 16:15:07.539401 4675 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.452935 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-69dsg"] Mar 20 16:15:18 crc kubenswrapper[4675]: E0320 16:15:18.453680 4675 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc6fe470-80b6-4726-b5b8-a70ca6901685" containerName="collect-profiles" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.453691 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6fe470-80b6-4726-b5b8-a70ca6901685" containerName="collect-profiles" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.453831 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6fe470-80b6-4726-b5b8-a70ca6901685" containerName="collect-profiles" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.454181 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.456290 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.458024 4675 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-f9hdw" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.460420 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ndpfw"] Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.461497 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ndpfw" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.465212 4675 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7fkbl" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.465533 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.482130 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-r8sbm"] Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.483087 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.485713 4675 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-b7kb2" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.490068 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ndpfw"] Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.500598 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-r8sbm"] Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.511303 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npcwz\" (UniqueName: \"kubernetes.io/projected/1e52063a-3614-4b21-8411-d54225a3f2ed-kube-api-access-npcwz\") pod \"cert-manager-webhook-687f57d79b-r8sbm\" (UID: \"1e52063a-3614-4b21-8411-d54225a3f2ed\") " pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.511348 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btj6x\" (UniqueName: 
\"kubernetes.io/projected/64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a-kube-api-access-btj6x\") pod \"cert-manager-cainjector-cf98fcc89-69dsg\" (UID: \"64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.511374 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfwm5\" (UniqueName: \"kubernetes.io/projected/b4d419db-f864-4092-88c5-dd3c49f133ee-kube-api-access-zfwm5\") pod \"cert-manager-858654f9db-ndpfw\" (UID: \"b4d419db-f864-4092-88c5-dd3c49f133ee\") " pod="cert-manager/cert-manager-858654f9db-ndpfw" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.512056 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-69dsg"] Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.612472 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npcwz\" (UniqueName: \"kubernetes.io/projected/1e52063a-3614-4b21-8411-d54225a3f2ed-kube-api-access-npcwz\") pod \"cert-manager-webhook-687f57d79b-r8sbm\" (UID: \"1e52063a-3614-4b21-8411-d54225a3f2ed\") " pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.612515 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btj6x\" (UniqueName: \"kubernetes.io/projected/64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a-kube-api-access-btj6x\") pod \"cert-manager-cainjector-cf98fcc89-69dsg\" (UID: \"64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.612540 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfwm5\" (UniqueName: \"kubernetes.io/projected/b4d419db-f864-4092-88c5-dd3c49f133ee-kube-api-access-zfwm5\") pod 
\"cert-manager-858654f9db-ndpfw\" (UID: \"b4d419db-f864-4092-88c5-dd3c49f133ee\") " pod="cert-manager/cert-manager-858654f9db-ndpfw" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.632042 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btj6x\" (UniqueName: \"kubernetes.io/projected/64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a-kube-api-access-btj6x\") pod \"cert-manager-cainjector-cf98fcc89-69dsg\" (UID: \"64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.633513 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npcwz\" (UniqueName: \"kubernetes.io/projected/1e52063a-3614-4b21-8411-d54225a3f2ed-kube-api-access-npcwz\") pod \"cert-manager-webhook-687f57d79b-r8sbm\" (UID: \"1e52063a-3614-4b21-8411-d54225a3f2ed\") " pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.641471 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfwm5\" (UniqueName: \"kubernetes.io/projected/b4d419db-f864-4092-88c5-dd3c49f133ee-kube-api-access-zfwm5\") pod \"cert-manager-858654f9db-ndpfw\" (UID: \"b4d419db-f864-4092-88c5-dd3c49f133ee\") " pod="cert-manager/cert-manager-858654f9db-ndpfw" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.774084 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.784555 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ndpfw" Mar 20 16:15:18 crc kubenswrapper[4675]: I0320 16:15:18.795663 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:19 crc kubenswrapper[4675]: I0320 16:15:19.010514 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ndpfw"] Mar 20 16:15:19 crc kubenswrapper[4675]: W0320 16:15:19.020827 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4d419db_f864_4092_88c5_dd3c49f133ee.slice/crio-3612e3de6a954b6f1cd0dfdd815d6b492f8c2d5dee1e7d021c9803606adf4e20 WatchSource:0}: Error finding container 3612e3de6a954b6f1cd0dfdd815d6b492f8c2d5dee1e7d021c9803606adf4e20: Status 404 returned error can't find the container with id 3612e3de6a954b6f1cd0dfdd815d6b492f8c2d5dee1e7d021c9803606adf4e20 Mar 20 16:15:19 crc kubenswrapper[4675]: I0320 16:15:19.022942 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:15:19 crc kubenswrapper[4675]: I0320 16:15:19.076951 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-r8sbm"] Mar 20 16:15:19 crc kubenswrapper[4675]: W0320 16:15:19.081000 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e52063a_3614_4b21_8411_d54225a3f2ed.slice/crio-7f93fb5b81388b5588e1144cadd8af6e6a2b6f6a6a21deb242fa9c1d662efb34 WatchSource:0}: Error finding container 7f93fb5b81388b5588e1144cadd8af6e6a2b6f6a6a21deb242fa9c1d662efb34: Status 404 returned error can't find the container with id 7f93fb5b81388b5588e1144cadd8af6e6a2b6f6a6a21deb242fa9c1d662efb34 Mar 20 16:15:19 crc kubenswrapper[4675]: I0320 16:15:19.190707 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" event={"ID":"1e52063a-3614-4b21-8411-d54225a3f2ed","Type":"ContainerStarted","Data":"7f93fb5b81388b5588e1144cadd8af6e6a2b6f6a6a21deb242fa9c1d662efb34"} 
Mar 20 16:15:19 crc kubenswrapper[4675]: I0320 16:15:19.191646 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ndpfw" event={"ID":"b4d419db-f864-4092-88c5-dd3c49f133ee","Type":"ContainerStarted","Data":"3612e3de6a954b6f1cd0dfdd815d6b492f8c2d5dee1e7d021c9803606adf4e20"} Mar 20 16:15:19 crc kubenswrapper[4675]: I0320 16:15:19.242349 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-69dsg"] Mar 20 16:15:19 crc kubenswrapper[4675]: W0320 16:15:19.246117 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64366dbb_dc3a_4f63_a27e_c8f8b1b2e95a.slice/crio-ae1a0da0e342ff5fcf4c6e38c5caaa29b7ce56695339aa3bd37249dc23180b1a WatchSource:0}: Error finding container ae1a0da0e342ff5fcf4c6e38c5caaa29b7ce56695339aa3bd37249dc23180b1a: Status 404 returned error can't find the container with id ae1a0da0e342ff5fcf4c6e38c5caaa29b7ce56695339aa3bd37249dc23180b1a Mar 20 16:15:20 crc kubenswrapper[4675]: I0320 16:15:20.207246 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" event={"ID":"64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a","Type":"ContainerStarted","Data":"ae1a0da0e342ff5fcf4c6e38c5caaa29b7ce56695339aa3bd37249dc23180b1a"} Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.228600 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" event={"ID":"64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a","Type":"ContainerStarted","Data":"685637673a425e4e4c2629530015c60cc55c8b1eba623284f33138d4cd6e0dda"} Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.230457 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" 
event={"ID":"1e52063a-3614-4b21-8411-d54225a3f2ed","Type":"ContainerStarted","Data":"36846bc0ee1ea410a35ca37c0f2d457f1f96f50b61b051191af7c9661ccb7146"} Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.230606 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.231470 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ndpfw" event={"ID":"b4d419db-f864-4092-88c5-dd3c49f133ee","Type":"ContainerStarted","Data":"5db931b8e10abb5199d18a92b5a1d31a08f086fba21fd9a53800e7f11a677f94"} Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.243478 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-69dsg" podStartSLOduration=1.673858175 podStartE2EDuration="5.24345962s" podCreationTimestamp="2026-03-20 16:15:18 +0000 UTC" firstStartedPulling="2026-03-20 16:15:19.24838686 +0000 UTC m=+839.282016397" lastFinishedPulling="2026-03-20 16:15:22.817988305 +0000 UTC m=+842.851617842" observedRunningTime="2026-03-20 16:15:23.241555385 +0000 UTC m=+843.275184922" watchObservedRunningTime="2026-03-20 16:15:23.24345962 +0000 UTC m=+843.277089157" Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.311907 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" podStartSLOduration=1.530581959 podStartE2EDuration="5.311891456s" podCreationTimestamp="2026-03-20 16:15:18 +0000 UTC" firstStartedPulling="2026-03-20 16:15:19.082907582 +0000 UTC m=+839.116537119" lastFinishedPulling="2026-03-20 16:15:22.864217039 +0000 UTC m=+842.897846616" observedRunningTime="2026-03-20 16:15:23.309139027 +0000 UTC m=+843.342768564" watchObservedRunningTime="2026-03-20 16:15:23.311891456 +0000 UTC m=+843.345520993" Mar 20 16:15:23 crc kubenswrapper[4675]: I0320 16:15:23.322411 4675 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ndpfw" podStartSLOduration=1.526171562 podStartE2EDuration="5.32239745s" podCreationTimestamp="2026-03-20 16:15:18 +0000 UTC" firstStartedPulling="2026-03-20 16:15:19.022746385 +0000 UTC m=+839.056375922" lastFinishedPulling="2026-03-20 16:15:22.818972273 +0000 UTC m=+842.852601810" observedRunningTime="2026-03-20 16:15:23.320724041 +0000 UTC m=+843.354353588" watchObservedRunningTime="2026-03-20 16:15:23.32239745 +0000 UTC m=+843.356026977" Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.650109 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n54g5"] Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.650822 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-controller" containerID="cri-o://ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.651214 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="sbdb" containerID="cri-o://59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.651286 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="nbdb" containerID="cri-o://0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.651330 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" 
podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="northd" containerID="cri-o://04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.651367 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.651408 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-node" containerID="cri-o://b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.651447 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-acl-logging" containerID="cri-o://0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.736482 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" containerID="cri-o://381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" gracePeriod=30 Mar 20 16:15:28 crc kubenswrapper[4675]: I0320 16:15:28.809518 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-r8sbm" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.017808 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/3.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.021300 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovn-acl-logging/0.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.022270 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovn-controller/0.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.022919 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085096 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qkw9g"] Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085299 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-node" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085311 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-node" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085320 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085326 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085334 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc 
kubenswrapper[4675]: I0320 16:15:29.085341 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085348 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085354 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085362 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085367 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085376 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kubecfg-setup" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085381 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kubecfg-setup" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085390 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="northd" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085396 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="northd" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085405 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-controller" Mar 20 16:15:29 crc 
kubenswrapper[4675]: I0320 16:15:29.085411 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085423 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-acl-logging" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085428 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-acl-logging" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085434 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="nbdb" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085439 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="nbdb" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085447 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="sbdb" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085453 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="sbdb" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085548 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-acl-logging" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085556 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085566 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085574 
4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-node" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085582 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085588 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085596 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovn-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085602 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="nbdb" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085608 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="sbdb" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085618 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="northd" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085702 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085709 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.085718 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: 
I0320 16:15:29.085723 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085850 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.085859 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="467da034-edb5-4a24-a940-839cc0131c75" containerName="ovnkube-controller" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.088249 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.188838 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467da034-edb5-4a24-a940-839cc0131c75-ovn-node-metrics-cert\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.188922 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-openvswitch\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.188976 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-slash\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189007 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-netns\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189060 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-log-socket\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189088 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-systemd\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189056 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189116 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-ovn-kubernetes\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189102 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-slash" (OuterVolumeSpecName: "host-slash") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189092 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189124 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-log-socket" (OuterVolumeSpecName: "log-socket") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189148 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-node-log\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189185 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-node-log" (OuterVolumeSpecName: "node-log") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189164 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189247 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-config\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189304 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-env-overrides\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189357 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-script-lib\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189386 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-etc-openvswitch\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189449 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-ovn\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189478 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjgt\" 
(UniqueName: \"kubernetes.io/projected/467da034-edb5-4a24-a940-839cc0131c75-kube-api-access-dqjgt\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189497 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-netd\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189532 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189556 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-systemd-units\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189577 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-bin\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189597 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-kubelet\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: 
\"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189618 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-var-lib-openvswitch\") pod \"467da034-edb5-4a24-a940-839cc0131c75\" (UID: \"467da034-edb5-4a24-a940-839cc0131c75\") " Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189660 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189689 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189712 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189760 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-log-socket\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189819 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-ovnkube-config\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189840 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189876 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189845 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-slash\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189899 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189918 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189918 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-etc-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189942 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189954 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.189973 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-systemd-units\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190000 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190044 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-node-log\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190068 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464ff\" (UniqueName: \"kubernetes.io/projected/524b7506-f57b-40bd-8450-4dc14a58b425-kube-api-access-464ff\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190115 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-openvswitch\") 
pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190135 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-var-lib-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190156 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-kubelet\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190188 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-run-netns\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190214 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-cni-netd\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190238 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-run-ovn-kubernetes\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190276 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-env-overrides\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190297 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524b7506-f57b-40bd-8450-4dc14a58b425-ovn-node-metrics-cert\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190327 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-cni-bin\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190375 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-ovnkube-script-lib\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190394 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190432 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-systemd\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190447 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-ovn\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190482 4675 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190511 4675 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190519 4675 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190528 4675 reconciler_common.go:293] "Volume 
detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190538 4675 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190546 4675 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190553 4675 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190560 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190587 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190789 4675 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190802 4675 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190811 4675 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190844 4675 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190855 4675 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190863 4675 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.190883 4675 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.194306 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467da034-edb5-4a24-a940-839cc0131c75-kube-api-access-dqjgt" (OuterVolumeSpecName: "kube-api-access-dqjgt") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "kube-api-access-dqjgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.194595 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467da034-edb5-4a24-a940-839cc0131c75-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.195719 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.210361 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "467da034-edb5-4a24-a940-839cc0131c75" (UID: "467da034-edb5-4a24-a940-839cc0131c75"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.275755 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovnkube-controller/3.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.278751 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovn-acl-logging/0.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279229 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n54g5_467da034-edb5-4a24-a940-839cc0131c75/ovn-controller/0.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279605 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" exitCode=0 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279635 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" exitCode=0 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279644 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" exitCode=0 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279637 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279692 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279719 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279732 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279654 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" exitCode=0 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279747 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" exitCode=0 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279752 4675 scope.go:117] "RemoveContainer" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279778 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279833 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279755 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" exitCode=0 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279858 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279917 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279931 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279938 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279946 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} Mar 20 16:15:29 crc 
kubenswrapper[4675]: I0320 16:15:29.279952 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279959 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279966 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279971 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279864 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" exitCode=143 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280044 4675 generic.go:334] "Generic (PLEG): container finished" podID="467da034-edb5-4a24-a940-839cc0131c75" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" exitCode=143 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.279978 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280112 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" 
event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280147 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280165 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280173 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280180 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280188 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280195 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280203 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280210 4675 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280217 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280224 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280234 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280245 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280254 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280261 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280269 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} Mar 20 16:15:29 crc kubenswrapper[4675]: 
I0320 16:15:29.280276 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280282 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280289 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280296 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280302 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280307 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280315 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n54g5" event={"ID":"467da034-edb5-4a24-a940-839cc0131c75","Type":"ContainerDied","Data":"3c07fae4f4f4fff6a2f6afc73bb9f06aee2c0f4c46bbd2e3b42afdc96ff9191f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280323 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280330 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280335 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280340 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280345 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280349 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280354 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280359 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280364 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.280370 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.283825 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/1.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.284692 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/0.log" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.284799 4675 generic.go:334] "Generic (PLEG): container finished" podID="7d530666-72d8-4520-a229-43eab240e5dd" containerID="e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4" exitCode=2 Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.284843 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvqmz" event={"ID":"7d530666-72d8-4520-a229-43eab240e5dd","Type":"ContainerDied","Data":"e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.284878 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d"} Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.285535 4675 scope.go:117] "RemoveContainer" containerID="e82a0e511d132bb4fb021ce96cf78faa3f3fa750ea608b9944037026053c56c4" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.297850 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-run-ovn-kubernetes\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.297894 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-env-overrides\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.297915 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524b7506-f57b-40bd-8450-4dc14a58b425-ovn-node-metrics-cert\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298041 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-cni-bin\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298062 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-ovnkube-script-lib\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.297912 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-run-ovn-kubernetes\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298121 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298085 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298175 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-cni-bin\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298226 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-systemd\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298410 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-env-overrides\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298534 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-ovn\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298596 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-systemd\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298601 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-ovn\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298661 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-log-socket\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298689 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-log-socket\") pod \"ovnkube-node-qkw9g\" (UID: 
\"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298718 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-ovnkube-config\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298739 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-slash\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298759 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-etc-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298795 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-ovnkube-script-lib\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298825 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-systemd-units\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298854 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-slash\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298868 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-node-log\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298881 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-systemd-units\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298890 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-node-log\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298859 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-etc-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298891 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-464ff\" (UniqueName: \"kubernetes.io/projected/524b7506-f57b-40bd-8450-4dc14a58b425-kube-api-access-464ff\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298939 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298959 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-var-lib-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298979 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-kubelet\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.298995 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-run-netns\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299012 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-cni-netd\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299046 4675 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467da034-edb5-4a24-a940-839cc0131c75-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299056 4675 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467da034-edb5-4a24-a940-839cc0131c75-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299066 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqjgt\" (UniqueName: \"kubernetes.io/projected/467da034-edb5-4a24-a940-839cc0131c75-kube-api-access-dqjgt\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299075 4675 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467da034-edb5-4a24-a940-839cc0131c75-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299098 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-cni-netd\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299118 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-run-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: 
\"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299147 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-var-lib-openvswitch\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299168 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-run-netns\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299176 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524b7506-f57b-40bd-8450-4dc14a58b425-host-kubelet\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.299644 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524b7506-f57b-40bd-8450-4dc14a58b425-ovnkube-config\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.304539 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524b7506-f57b-40bd-8450-4dc14a58b425-ovn-node-metrics-cert\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 
16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.304695 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.321510 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-464ff\" (UniqueName: \"kubernetes.io/projected/524b7506-f57b-40bd-8450-4dc14a58b425-kube-api-access-464ff\") pod \"ovnkube-node-qkw9g\" (UID: \"524b7506-f57b-40bd-8450-4dc14a58b425\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.323659 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n54g5"] Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.328815 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n54g5"] Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.345420 4675 scope.go:117] "RemoveContainer" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.358328 4675 scope.go:117] "RemoveContainer" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.392998 4675 scope.go:117] "RemoveContainer" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.405354 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.412092 4675 scope.go:117] "RemoveContainer" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.438855 4675 scope.go:117] "RemoveContainer" containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.460301 4675 scope.go:117] "RemoveContainer" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.488160 4675 scope.go:117] "RemoveContainer" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.512367 4675 scope.go:117] "RemoveContainer" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.525901 4675 scope.go:117] "RemoveContainer" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.526536 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": container with ID starting with 381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b not found: ID does not exist" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.526570 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} err="failed to get container status \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": rpc error: code = NotFound desc = could not find 
container \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": container with ID starting with 381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.526593 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.527088 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": container with ID starting with fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492 not found: ID does not exist" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.527114 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} err="failed to get container status \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": rpc error: code = NotFound desc = could not find container \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": container with ID starting with fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.527131 4675 scope.go:117] "RemoveContainer" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.527563 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": container with ID starting with 59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f not found: ID does 
not exist" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.527585 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} err="failed to get container status \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": rpc error: code = NotFound desc = could not find container \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": container with ID starting with 59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.527598 4675 scope.go:117] "RemoveContainer" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.527876 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": container with ID starting with 0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a not found: ID does not exist" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.527923 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} err="failed to get container status \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": rpc error: code = NotFound desc = could not find container \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": container with ID starting with 0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.527965 4675 
scope.go:117] "RemoveContainer" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.528252 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": container with ID starting with 04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab not found: ID does not exist" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.528281 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} err="failed to get container status \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": rpc error: code = NotFound desc = could not find container \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": container with ID starting with 04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.528297 4675 scope.go:117] "RemoveContainer" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.528680 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": container with ID starting with 12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf not found: ID does not exist" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.528701 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} err="failed to get container status \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": rpc error: code = NotFound desc = could not find container \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": container with ID starting with 12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.528729 4675 scope.go:117] "RemoveContainer" containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.529009 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": container with ID starting with b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b not found: ID does not exist" containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529031 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} err="failed to get container status \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": rpc error: code = NotFound desc = could not find container \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": container with ID starting with b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529043 4675 scope.go:117] "RemoveContainer" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.529323 4675 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": container with ID starting with 0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f not found: ID does not exist" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529343 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} err="failed to get container status \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": rpc error: code = NotFound desc = could not find container \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": container with ID starting with 0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529371 4675 scope.go:117] "RemoveContainer" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.529569 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": container with ID starting with ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304 not found: ID does not exist" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529589 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} err="failed to get container status \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": rpc error: code = NotFound desc = could not find container 
\"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": container with ID starting with ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529602 4675 scope.go:117] "RemoveContainer" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" Mar 20 16:15:29 crc kubenswrapper[4675]: E0320 16:15:29.529828 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": container with ID starting with 86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3 not found: ID does not exist" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529846 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} err="failed to get container status \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": rpc error: code = NotFound desc = could not find container \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": container with ID starting with 86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.529884 4675 scope.go:117] "RemoveContainer" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.530069 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} err="failed to get container status \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": rpc error: code = NotFound desc = could not find 
container \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": container with ID starting with 381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.530089 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.530561 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} err="failed to get container status \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": rpc error: code = NotFound desc = could not find container \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": container with ID starting with fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.530579 4675 scope.go:117] "RemoveContainer" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.531526 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} err="failed to get container status \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": rpc error: code = NotFound desc = could not find container \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": container with ID starting with 59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.531546 4675 scope.go:117] "RemoveContainer" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.531832 4675 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} err="failed to get container status \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": rpc error: code = NotFound desc = could not find container \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": container with ID starting with 0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.531852 4675 scope.go:117] "RemoveContainer" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532088 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} err="failed to get container status \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": rpc error: code = NotFound desc = could not find container \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": container with ID starting with 04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532128 4675 scope.go:117] "RemoveContainer" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532305 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} err="failed to get container status \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": rpc error: code = NotFound desc = could not find container \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": container with ID starting with 
12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532323 4675 scope.go:117] "RemoveContainer" containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532499 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} err="failed to get container status \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": rpc error: code = NotFound desc = could not find container \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": container with ID starting with b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532516 4675 scope.go:117] "RemoveContainer" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532684 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} err="failed to get container status \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": rpc error: code = NotFound desc = could not find container \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": container with ID starting with 0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.532913 4675 scope.go:117] "RemoveContainer" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533288 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} err="failed to get container status \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": rpc error: code = NotFound desc = could not find container \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": container with ID starting with ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533307 4675 scope.go:117] "RemoveContainer" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533505 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} err="failed to get container status \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": rpc error: code = NotFound desc = could not find container \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": container with ID starting with 86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533526 4675 scope.go:117] "RemoveContainer" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533674 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} err="failed to get container status \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": rpc error: code = NotFound desc = could not find container \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": container with ID starting with 381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b not found: ID does not 
exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533689 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533843 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} err="failed to get container status \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": rpc error: code = NotFound desc = could not find container \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": container with ID starting with fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533858 4675 scope.go:117] "RemoveContainer" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.533995 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} err="failed to get container status \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": rpc error: code = NotFound desc = could not find container \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": container with ID starting with 59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534011 4675 scope.go:117] "RemoveContainer" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534143 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} err="failed to get container status 
\"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": rpc error: code = NotFound desc = could not find container \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": container with ID starting with 0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534157 4675 scope.go:117] "RemoveContainer" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534278 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} err="failed to get container status \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": rpc error: code = NotFound desc = could not find container \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": container with ID starting with 04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534292 4675 scope.go:117] "RemoveContainer" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534414 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} err="failed to get container status \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": rpc error: code = NotFound desc = could not find container \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": container with ID starting with 12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534428 4675 scope.go:117] "RemoveContainer" 
containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534544 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} err="failed to get container status \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": rpc error: code = NotFound desc = could not find container \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": container with ID starting with b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534560 4675 scope.go:117] "RemoveContainer" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534729 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} err="failed to get container status \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": rpc error: code = NotFound desc = could not find container \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": container with ID starting with 0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534744 4675 scope.go:117] "RemoveContainer" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534902 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} err="failed to get container status \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": rpc error: code = NotFound desc = could 
not find container \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": container with ID starting with ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.534914 4675 scope.go:117] "RemoveContainer" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535084 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} err="failed to get container status \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": rpc error: code = NotFound desc = could not find container \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": container with ID starting with 86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535098 4675 scope.go:117] "RemoveContainer" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535226 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} err="failed to get container status \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": rpc error: code = NotFound desc = could not find container \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": container with ID starting with 381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535241 4675 scope.go:117] "RemoveContainer" containerID="fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 
16:15:29.535619 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492"} err="failed to get container status \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": rpc error: code = NotFound desc = could not find container \"fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492\": container with ID starting with fa71b8c115972c6822f0e16c2093c1c6ecfa139c91a7c2148de9e546df909492 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535684 4675 scope.go:117] "RemoveContainer" containerID="59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535882 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f"} err="failed to get container status \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": rpc error: code = NotFound desc = could not find container \"59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f\": container with ID starting with 59cee278e27e96cc718432881e97b3cb46697e7e724ed690fe9577e3b820292f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.535896 4675 scope.go:117] "RemoveContainer" containerID="0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536034 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a"} err="failed to get container status \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": rpc error: code = NotFound desc = could not find container \"0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a\": container with ID starting with 
0df1aeea6d4cf83e9af3d16043cdc6c79d1844580ac0835b54d89005bb6bc04a not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536044 4675 scope.go:117] "RemoveContainer" containerID="04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536198 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab"} err="failed to get container status \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": rpc error: code = NotFound desc = could not find container \"04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab\": container with ID starting with 04ac795583f7fe335cf5c10cc8ba4dbb0fecadb4d0712df764f95bc5fdeb3bab not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536209 4675 scope.go:117] "RemoveContainer" containerID="12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536330 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf"} err="failed to get container status \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": rpc error: code = NotFound desc = could not find container \"12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf\": container with ID starting with 12bafe7378e980531f3de7f1c716ed820f3aceb1a254c6e230390d9d7e121aaf not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536342 4675 scope.go:117] "RemoveContainer" containerID="b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536510 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b"} err="failed to get container status \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": rpc error: code = NotFound desc = could not find container \"b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b\": container with ID starting with b93fdd19443c6777d672f3248a693bff6f35c3ff02970b73297246d013ee5a1b not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536526 4675 scope.go:117] "RemoveContainer" containerID="0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536664 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f"} err="failed to get container status \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": rpc error: code = NotFound desc = could not find container \"0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f\": container with ID starting with 0db34bf176847115b448702b917ac035c3aed1badca60b81d7e8ed6be759af2f not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536678 4675 scope.go:117] "RemoveContainer" containerID="ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536905 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304"} err="failed to get container status \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": rpc error: code = NotFound desc = could not find container \"ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304\": container with ID starting with ca1be06b6e6c86e921370b212480108089254b2d6f16e6462ea1748ac9308304 not found: ID does not 
exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.536921 4675 scope.go:117] "RemoveContainer" containerID="86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.537070 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3"} err="failed to get container status \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": rpc error: code = NotFound desc = could not find container \"86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3\": container with ID starting with 86206ea0578b66e01fc636f803b6fbc6a1aa6a3b5fe75d0c7128c2ccb41570e3 not found: ID does not exist" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.537085 4675 scope.go:117] "RemoveContainer" containerID="381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b" Mar 20 16:15:29 crc kubenswrapper[4675]: I0320 16:15:29.537226 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b"} err="failed to get container status \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": rpc error: code = NotFound desc = could not find container \"381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b\": container with ID starting with 381facd990bb85a307546b323fa6b738b240d0dd8c61d87de8898863ec06563b not found: ID does not exist" Mar 20 16:15:30 crc kubenswrapper[4675]: I0320 16:15:30.299723 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/1.log" Mar 20 16:15:30 crc kubenswrapper[4675]: I0320 16:15:30.300333 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/0.log" Mar 20 16:15:30 crc 
kubenswrapper[4675]: I0320 16:15:30.300433 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvqmz" event={"ID":"7d530666-72d8-4520-a229-43eab240e5dd","Type":"ContainerStarted","Data":"e612a502ea9c756cafb342c4ecc7db7813d46f3f498a419ba8498e589f8a2ca9"} Mar 20 16:15:30 crc kubenswrapper[4675]: I0320 16:15:30.304972 4675 generic.go:334] "Generic (PLEG): container finished" podID="524b7506-f57b-40bd-8450-4dc14a58b425" containerID="4f9ee21fac8f1268ff6128625c84900051938030d072778f5d2c808c026f1fc4" exitCode=0 Mar 20 16:15:30 crc kubenswrapper[4675]: I0320 16:15:30.305041 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerDied","Data":"4f9ee21fac8f1268ff6128625c84900051938030d072778f5d2c808c026f1fc4"} Mar 20 16:15:30 crc kubenswrapper[4675]: I0320 16:15:30.305086 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"6aec6f6ada550d8ef4c800217bc3813550a3eba294130aa030b54bd60765fad7"} Mar 20 16:15:30 crc kubenswrapper[4675]: I0320 16:15:30.680535 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467da034-edb5-4a24-a940-839cc0131c75" path="/var/lib/kubelet/pods/467da034-edb5-4a24-a940-839cc0131c75/volumes" Mar 20 16:15:31 crc kubenswrapper[4675]: I0320 16:15:31.315071 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"97213f67f095d956f8fd20b521c2591cd0d28aed6e393c609a1a68973373c02f"} Mar 20 16:15:31 crc kubenswrapper[4675]: I0320 16:15:31.315115 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" 
event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"b5537cfc32e3a3d6ed8f1c8d553bee56194b7e72dafde4fd1caa104a164ea9c5"} Mar 20 16:15:31 crc kubenswrapper[4675]: I0320 16:15:31.315128 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"bbce0b4a0983eed87eadb24ae8a0e28f312a20cb5a6e95819031d0f881e2a4dd"} Mar 20 16:15:31 crc kubenswrapper[4675]: I0320 16:15:31.315139 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"611eaef7deb698f353c90981b339f406a7ab21e4b310b52f82883a8b47ce410c"} Mar 20 16:15:31 crc kubenswrapper[4675]: I0320 16:15:31.315152 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"d726ef3c4f1e2497a51ae38940aa8203687757a0256261bc3630ac1eb2d76654"} Mar 20 16:15:31 crc kubenswrapper[4675]: I0320 16:15:31.315163 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"d156bce2687d059b0c4da8595251415c30aca2cbb7faee593bea99feca248074"} Mar 20 16:15:33 crc kubenswrapper[4675]: I0320 16:15:33.332782 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"1ea1002c7e5a443823d021ea167e0e97eaf6c16e6a7d8b43dae01b06d6840a0d"} Mar 20 16:15:36 crc kubenswrapper[4675]: I0320 16:15:36.095990 4675 scope.go:117] "RemoveContainer" containerID="457a671934a9fd1639cc05e402f1e9fb4cd6d848019f0aeacfc5dc1c9626c39d" Mar 20 16:15:36 crc kubenswrapper[4675]: I0320 16:15:36.354583 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" event={"ID":"524b7506-f57b-40bd-8450-4dc14a58b425","Type":"ContainerStarted","Data":"e236723a52409268d8fd0a4d5fda8f36082565b4949c3774803eecb15f9a5e50"} Mar 20 16:15:36 crc kubenswrapper[4675]: I0320 16:15:36.386610 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" podStartSLOduration=7.386582801 podStartE2EDuration="7.386582801s" podCreationTimestamp="2026-03-20 16:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:15:36.383690739 +0000 UTC m=+856.417320316" watchObservedRunningTime="2026-03-20 16:15:36.386582801 +0000 UTC m=+856.420212378" Mar 20 16:15:37 crc kubenswrapper[4675]: I0320 16:15:37.361261 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvqmz_7d530666-72d8-4520-a229-43eab240e5dd/kube-multus/1.log" Mar 20 16:15:37 crc kubenswrapper[4675]: I0320 16:15:37.362935 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:37 crc kubenswrapper[4675]: I0320 16:15:37.362978 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:37 crc kubenswrapper[4675]: I0320 16:15:37.362994 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:37 crc kubenswrapper[4675]: I0320 16:15:37.387121 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:37 crc kubenswrapper[4675]: I0320 16:15:37.394124 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:38 crc kubenswrapper[4675]: 
I0320 16:15:38.491061 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rmjwn"] Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.492845 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.515240 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmjwn"] Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.618089 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqcvq\" (UniqueName: \"kubernetes.io/projected/e2461648-2cc2-4b16-9e85-5d6a3af9b072-kube-api-access-mqcvq\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.618212 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-utilities\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.618302 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-catalog-content\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.719167 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-catalog-content\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.719716 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqcvq\" (UniqueName: \"kubernetes.io/projected/e2461648-2cc2-4b16-9e85-5d6a3af9b072-kube-api-access-mqcvq\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.719734 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-utilities\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.719631 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-catalog-content\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.720285 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-utilities\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.740978 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqcvq\" (UniqueName: 
\"kubernetes.io/projected/e2461648-2cc2-4b16-9e85-5d6a3af9b072-kube-api-access-mqcvq\") pod \"certified-operators-rmjwn\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: I0320 16:15:38.820463 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: E0320 16:15:38.854954 4675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-rmjwn_openshift-marketplace_e2461648-2cc2-4b16-9e85-5d6a3af9b072_0(7a9494723c3ed007d1e1532babab617f546b8aedc073a7618fbb5e41c23ba0a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:15:38 crc kubenswrapper[4675]: E0320 16:15:38.855036 4675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-rmjwn_openshift-marketplace_e2461648-2cc2-4b16-9e85-5d6a3af9b072_0(7a9494723c3ed007d1e1532babab617f546b8aedc073a7618fbb5e41c23ba0a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: E0320 16:15:38.855061 4675 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-rmjwn_openshift-marketplace_e2461648-2cc2-4b16-9e85-5d6a3af9b072_0(7a9494723c3ed007d1e1532babab617f546b8aedc073a7618fbb5e41c23ba0a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:38 crc kubenswrapper[4675]: E0320 16:15:38.855127 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-rmjwn_openshift-marketplace(e2461648-2cc2-4b16-9e85-5d6a3af9b072)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-rmjwn_openshift-marketplace(e2461648-2cc2-4b16-9e85-5d6a3af9b072)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-rmjwn_openshift-marketplace_e2461648-2cc2-4b16-9e85-5d6a3af9b072_0(7a9494723c3ed007d1e1532babab617f546b8aedc073a7618fbb5e41c23ba0a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-rmjwn" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" Mar 20 16:15:39 crc kubenswrapper[4675]: I0320 16:15:39.372810 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:39 crc kubenswrapper[4675]: I0320 16:15:39.373696 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:39 crc kubenswrapper[4675]: I0320 16:15:39.555364 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmjwn"] Mar 20 16:15:39 crc kubenswrapper[4675]: W0320 16:15:39.562057 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2461648_2cc2_4b16_9e85_5d6a3af9b072.slice/crio-2f9b6c27c605da4b9e8809cac11e2765152f8e4e6bd4027559b81f540140dc0e WatchSource:0}: Error finding container 2f9b6c27c605da4b9e8809cac11e2765152f8e4e6bd4027559b81f540140dc0e: Status 404 returned error can't find the container with id 2f9b6c27c605da4b9e8809cac11e2765152f8e4e6bd4027559b81f540140dc0e Mar 20 16:15:40 crc kubenswrapper[4675]: I0320 16:15:40.389315 4675 generic.go:334] "Generic (PLEG): container finished" podID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerID="1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5" exitCode=0 Mar 20 16:15:40 crc kubenswrapper[4675]: I0320 16:15:40.389428 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerDied","Data":"1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5"} Mar 20 16:15:40 crc kubenswrapper[4675]: I0320 16:15:40.389761 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerStarted","Data":"2f9b6c27c605da4b9e8809cac11e2765152f8e4e6bd4027559b81f540140dc0e"} Mar 20 16:15:41 crc kubenswrapper[4675]: I0320 16:15:41.396213 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" 
event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerStarted","Data":"39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435"} Mar 20 16:15:42 crc kubenswrapper[4675]: I0320 16:15:42.402011 4675 generic.go:334] "Generic (PLEG): container finished" podID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerID="39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435" exitCode=0 Mar 20 16:15:42 crc kubenswrapper[4675]: I0320 16:15:42.402070 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerDied","Data":"39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435"} Mar 20 16:15:43 crc kubenswrapper[4675]: I0320 16:15:43.411332 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerStarted","Data":"354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e"} Mar 20 16:15:43 crc kubenswrapper[4675]: I0320 16:15:43.436652 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rmjwn" podStartSLOduration=3.01121544 podStartE2EDuration="5.436629552s" podCreationTimestamp="2026-03-20 16:15:38 +0000 UTC" firstStartedPulling="2026-03-20 16:15:40.394281048 +0000 UTC m=+860.427910625" lastFinishedPulling="2026-03-20 16:15:42.81969517 +0000 UTC m=+862.853324737" observedRunningTime="2026-03-20 16:15:43.434420109 +0000 UTC m=+863.468049686" watchObservedRunningTime="2026-03-20 16:15:43.436629552 +0000 UTC m=+863.470259099" Mar 20 16:15:45 crc kubenswrapper[4675]: I0320 16:15:45.886710 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qw47k"] Mar 20 16:15:45 crc kubenswrapper[4675]: I0320 16:15:45.888449 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:45 crc kubenswrapper[4675]: I0320 16:15:45.891726 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qw47k"] Mar 20 16:15:45 crc kubenswrapper[4675]: I0320 16:15:45.919482 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-utilities\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:45 crc kubenswrapper[4675]: I0320 16:15:45.919528 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-catalog-content\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:45 crc kubenswrapper[4675]: I0320 16:15:45.919573 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdc7d\" (UniqueName: \"kubernetes.io/projected/432c49bc-732b-4769-b8bb-3700ee59898f-kube-api-access-rdc7d\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.034361 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-utilities\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.034434 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-catalog-content\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.034517 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdc7d\" (UniqueName: \"kubernetes.io/projected/432c49bc-732b-4769-b8bb-3700ee59898f-kube-api-access-rdc7d\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.035082 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-utilities\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.035100 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-catalog-content\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.054094 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdc7d\" (UniqueName: \"kubernetes.io/projected/432c49bc-732b-4769-b8bb-3700ee59898f-kube-api-access-rdc7d\") pod \"community-operators-qw47k\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.221021 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:46 crc kubenswrapper[4675]: I0320 16:15:46.687989 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qw47k"] Mar 20 16:15:47 crc kubenswrapper[4675]: I0320 16:15:47.438318 4675 generic.go:334] "Generic (PLEG): container finished" podID="432c49bc-732b-4769-b8bb-3700ee59898f" containerID="c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d" exitCode=0 Mar 20 16:15:47 crc kubenswrapper[4675]: I0320 16:15:47.438423 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qw47k" event={"ID":"432c49bc-732b-4769-b8bb-3700ee59898f","Type":"ContainerDied","Data":"c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d"} Mar 20 16:15:47 crc kubenswrapper[4675]: I0320 16:15:47.438616 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qw47k" event={"ID":"432c49bc-732b-4769-b8bb-3700ee59898f","Type":"ContainerStarted","Data":"3d103b7e6f95802521735751006f24f87112e50cad56feca9e07922fd13e70f6"} Mar 20 16:15:48 crc kubenswrapper[4675]: I0320 16:15:48.821462 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:48 crc kubenswrapper[4675]: I0320 16:15:48.822245 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:48 crc kubenswrapper[4675]: I0320 16:15:48.864410 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:49 crc kubenswrapper[4675]: I0320 16:15:49.455289 4675 generic.go:334] "Generic (PLEG): container finished" podID="432c49bc-732b-4769-b8bb-3700ee59898f" containerID="c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25" exitCode=0 Mar 20 16:15:49 crc 
kubenswrapper[4675]: I0320 16:15:49.455393 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qw47k" event={"ID":"432c49bc-732b-4769-b8bb-3700ee59898f","Type":"ContainerDied","Data":"c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25"} Mar 20 16:15:49 crc kubenswrapper[4675]: I0320 16:15:49.490991 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:50 crc kubenswrapper[4675]: I0320 16:15:50.461605 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmjwn"] Mar 20 16:15:50 crc kubenswrapper[4675]: I0320 16:15:50.464976 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qw47k" event={"ID":"432c49bc-732b-4769-b8bb-3700ee59898f","Type":"ContainerStarted","Data":"d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93"} Mar 20 16:15:50 crc kubenswrapper[4675]: I0320 16:15:50.486207 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qw47k" podStartSLOduration=3.062323779 podStartE2EDuration="5.486182049s" podCreationTimestamp="2026-03-20 16:15:45 +0000 UTC" firstStartedPulling="2026-03-20 16:15:47.441275481 +0000 UTC m=+867.474905068" lastFinishedPulling="2026-03-20 16:15:49.865133791 +0000 UTC m=+869.898763338" observedRunningTime="2026-03-20 16:15:50.483124372 +0000 UTC m=+870.516753919" watchObservedRunningTime="2026-03-20 16:15:50.486182049 +0000 UTC m=+870.519811626" Mar 20 16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.470392 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rmjwn" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="registry-server" containerID="cri-o://354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e" gracePeriod=2 Mar 20 
16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.835739 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.917176 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-catalog-content\") pod \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " Mar 20 16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.919022 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqcvq\" (UniqueName: \"kubernetes.io/projected/e2461648-2cc2-4b16-9e85-5d6a3af9b072-kube-api-access-mqcvq\") pod \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " Mar 20 16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.919090 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-utilities\") pod \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\" (UID: \"e2461648-2cc2-4b16-9e85-5d6a3af9b072\") " Mar 20 16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.920279 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-utilities" (OuterVolumeSpecName: "utilities") pod "e2461648-2cc2-4b16-9e85-5d6a3af9b072" (UID: "e2461648-2cc2-4b16-9e85-5d6a3af9b072"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:51 crc kubenswrapper[4675]: I0320 16:15:51.926066 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2461648-2cc2-4b16-9e85-5d6a3af9b072-kube-api-access-mqcvq" (OuterVolumeSpecName: "kube-api-access-mqcvq") pod "e2461648-2cc2-4b16-9e85-5d6a3af9b072" (UID: "e2461648-2cc2-4b16-9e85-5d6a3af9b072"). InnerVolumeSpecName "kube-api-access-mqcvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.020901 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.020936 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqcvq\" (UniqueName: \"kubernetes.io/projected/e2461648-2cc2-4b16-9e85-5d6a3af9b072-kube-api-access-mqcvq\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.447275 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2461648-2cc2-4b16-9e85-5d6a3af9b072" (UID: "e2461648-2cc2-4b16-9e85-5d6a3af9b072"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.482892 4675 generic.go:334] "Generic (PLEG): container finished" podID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerID="354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e" exitCode=0 Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.483096 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerDied","Data":"354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e"} Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.483201 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmjwn" event={"ID":"e2461648-2cc2-4b16-9e85-5d6a3af9b072","Type":"ContainerDied","Data":"2f9b6c27c605da4b9e8809cac11e2765152f8e4e6bd4027559b81f540140dc0e"} Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.483110 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmjwn" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.483464 4675 scope.go:117] "RemoveContainer" containerID="354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.516822 4675 scope.go:117] "RemoveContainer" containerID="39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.526085 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2461648-2cc2-4b16-9e85-5d6a3af9b072-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.550541 4675 scope.go:117] "RemoveContainer" containerID="1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.578172 4675 scope.go:117] "RemoveContainer" containerID="354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e" Mar 20 16:15:52 crc kubenswrapper[4675]: E0320 16:15:52.579112 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e\": container with ID starting with 354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e not found: ID does not exist" containerID="354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.579192 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e"} err="failed to get container status \"354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e\": rpc error: code = NotFound desc = could not find container \"354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e\": 
container with ID starting with 354f8a4f44822fc948126dda8331b782cdb1a9ba319003af439a483a49d4274e not found: ID does not exist" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.579221 4675 scope.go:117] "RemoveContainer" containerID="39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435" Mar 20 16:15:52 crc kubenswrapper[4675]: E0320 16:15:52.579971 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435\": container with ID starting with 39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435 not found: ID does not exist" containerID="39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.580002 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435"} err="failed to get container status \"39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435\": rpc error: code = NotFound desc = could not find container \"39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435\": container with ID starting with 39cb6ca723ca3a3f4c0d6e8b9500ba9b962a3cf27ae68e28dc426d5d22d8f435 not found: ID does not exist" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.580020 4675 scope.go:117] "RemoveContainer" containerID="1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5" Mar 20 16:15:52 crc kubenswrapper[4675]: E0320 16:15:52.580468 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5\": container with ID starting with 1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5 not found: ID does not exist" 
containerID="1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.580494 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5"} err="failed to get container status \"1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5\": rpc error: code = NotFound desc = could not find container \"1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5\": container with ID starting with 1d1f437c34419558a989e34436d0f70608ff068d8cd2be054801f732a19fa6f5 not found: ID does not exist" Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.582634 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmjwn"] Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.587410 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rmjwn"] Mar 20 16:15:52 crc kubenswrapper[4675]: I0320 16:15:52.688898 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" path="/var/lib/kubelet/pods/e2461648-2cc2-4b16-9e85-5d6a3af9b072/volumes" Mar 20 16:15:56 crc kubenswrapper[4675]: I0320 16:15:56.222036 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:56 crc kubenswrapper[4675]: I0320 16:15:56.222419 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:56 crc kubenswrapper[4675]: I0320 16:15:56.296790 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:56 crc kubenswrapper[4675]: I0320 16:15:56.554501 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:57 crc kubenswrapper[4675]: I0320 16:15:57.259424 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qw47k"] Mar 20 16:15:58 crc kubenswrapper[4675]: I0320 16:15:58.522440 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qw47k" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="registry-server" containerID="cri-o://d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93" gracePeriod=2 Mar 20 16:15:58 crc kubenswrapper[4675]: I0320 16:15:58.950695 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.009229 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-utilities\") pod \"432c49bc-732b-4769-b8bb-3700ee59898f\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.009327 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-catalog-content\") pod \"432c49bc-732b-4769-b8bb-3700ee59898f\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.009400 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdc7d\" (UniqueName: \"kubernetes.io/projected/432c49bc-732b-4769-b8bb-3700ee59898f-kube-api-access-rdc7d\") pod \"432c49bc-732b-4769-b8bb-3700ee59898f\" (UID: \"432c49bc-732b-4769-b8bb-3700ee59898f\") " Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.010167 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-utilities" (OuterVolumeSpecName: "utilities") pod "432c49bc-732b-4769-b8bb-3700ee59898f" (UID: "432c49bc-732b-4769-b8bb-3700ee59898f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.015874 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432c49bc-732b-4769-b8bb-3700ee59898f-kube-api-access-rdc7d" (OuterVolumeSpecName: "kube-api-access-rdc7d") pod "432c49bc-732b-4769-b8bb-3700ee59898f" (UID: "432c49bc-732b-4769-b8bb-3700ee59898f"). InnerVolumeSpecName "kube-api-access-rdc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.111447 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdc7d\" (UniqueName: \"kubernetes.io/projected/432c49bc-732b-4769-b8bb-3700ee59898f-kube-api-access-rdc7d\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.111494 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.435607 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkw9g" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.530673 4675 generic.go:334] "Generic (PLEG): container finished" podID="432c49bc-732b-4769-b8bb-3700ee59898f" containerID="d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93" exitCode=0 Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.530712 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qw47k" 
event={"ID":"432c49bc-732b-4769-b8bb-3700ee59898f","Type":"ContainerDied","Data":"d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93"} Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.530735 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qw47k" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.530754 4675 scope.go:117] "RemoveContainer" containerID="d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.530744 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qw47k" event={"ID":"432c49bc-732b-4769-b8bb-3700ee59898f","Type":"ContainerDied","Data":"3d103b7e6f95802521735751006f24f87112e50cad56feca9e07922fd13e70f6"} Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.546041 4675 scope.go:117] "RemoveContainer" containerID="c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.565476 4675 scope.go:117] "RemoveContainer" containerID="c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.585619 4675 scope.go:117] "RemoveContainer" containerID="d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93" Mar 20 16:15:59 crc kubenswrapper[4675]: E0320 16:15:59.586085 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93\": container with ID starting with d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93 not found: ID does not exist" containerID="d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.586131 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93"} err="failed to get container status \"d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93\": rpc error: code = NotFound desc = could not find container \"d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93\": container with ID starting with d269ae07479292826cab2a89591bb8b88ca15cb77c456763d7fc4044abe3ff93 not found: ID does not exist" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.586165 4675 scope.go:117] "RemoveContainer" containerID="c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25" Mar 20 16:15:59 crc kubenswrapper[4675]: E0320 16:15:59.586477 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25\": container with ID starting with c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25 not found: ID does not exist" containerID="c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.586507 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25"} err="failed to get container status \"c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25\": rpc error: code = NotFound desc = could not find container \"c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25\": container with ID starting with c861fbfd4c1240c39001d095d790525caf3d30f1a27bd9ed1508f3448fc36c25 not found: ID does not exist" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.586544 4675 scope.go:117] "RemoveContainer" containerID="c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d" Mar 20 16:15:59 crc kubenswrapper[4675]: E0320 16:15:59.586816 4675 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d\": container with ID starting with c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d not found: ID does not exist" containerID="c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.586852 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d"} err="failed to get container status \"c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d\": rpc error: code = NotFound desc = could not find container \"c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d\": container with ID starting with c87ee73a51b0ca395a5a8a41511d5c53fcb7731816f55b2d24e4a641e4c6517d not found: ID does not exist" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.895184 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "432c49bc-732b-4769-b8bb-3700ee59898f" (UID: "432c49bc-732b-4769-b8bb-3700ee59898f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:59 crc kubenswrapper[4675]: I0320 16:15:59.925114 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432c49bc-732b-4769-b8bb-3700ee59898f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140519 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567056-7z8lw"] Mar 20 16:16:00 crc kubenswrapper[4675]: E0320 16:16:00.140720 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="extract-utilities" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140732 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="extract-utilities" Mar 20 16:16:00 crc kubenswrapper[4675]: E0320 16:16:00.140741 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="extract-content" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140747 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="extract-content" Mar 20 16:16:00 crc kubenswrapper[4675]: E0320 16:16:00.140757 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="registry-server" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140775 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="registry-server" Mar 20 16:16:00 crc kubenswrapper[4675]: E0320 16:16:00.140785 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="extract-utilities" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140791 4675 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="extract-utilities" Mar 20 16:16:00 crc kubenswrapper[4675]: E0320 16:16:00.140800 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="extract-content" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140805 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="extract-content" Mar 20 16:16:00 crc kubenswrapper[4675]: E0320 16:16:00.140814 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="registry-server" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140819 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="registry-server" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140902 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" containerName="registry-server" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.140912 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2461648-2cc2-4b16-9e85-5d6a3af9b072" containerName="registry-server" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.141240 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.144170 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.144231 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.144442 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.156034 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-7z8lw"] Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.184337 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qw47k"] Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.190757 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qw47k"] Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.229655 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jphsv\" (UniqueName: \"kubernetes.io/projected/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6-kube-api-access-jphsv\") pod \"auto-csr-approver-29567056-7z8lw\" (UID: \"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6\") " pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.331546 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jphsv\" (UniqueName: \"kubernetes.io/projected/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6-kube-api-access-jphsv\") pod \"auto-csr-approver-29567056-7z8lw\" (UID: \"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6\") " pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:00 
crc kubenswrapper[4675]: I0320 16:16:00.349046 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jphsv\" (UniqueName: \"kubernetes.io/projected/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6-kube-api-access-jphsv\") pod \"auto-csr-approver-29567056-7z8lw\" (UID: \"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6\") " pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.475235 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.668144 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-7z8lw"] Mar 20 16:16:00 crc kubenswrapper[4675]: I0320 16:16:00.680438 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432c49bc-732b-4769-b8bb-3700ee59898f" path="/var/lib/kubelet/pods/432c49bc-732b-4769-b8bb-3700ee59898f/volumes" Mar 20 16:16:01 crc kubenswrapper[4675]: I0320 16:16:01.543994 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" event={"ID":"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6","Type":"ContainerStarted","Data":"b443a8950165dc2f3f2eee04c1a29d95ad13b76e6054797f7f4d31c1af86ab13"} Mar 20 16:16:03 crc kubenswrapper[4675]: I0320 16:16:03.557657 4675 generic.go:334] "Generic (PLEG): container finished" podID="3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6" containerID="9dfba5c9c08b8127e29669df841cfb70228b4a7a8415cb43c4a31498c4a432ec" exitCode=0 Mar 20 16:16:03 crc kubenswrapper[4675]: I0320 16:16:03.557747 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" event={"ID":"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6","Type":"ContainerDied","Data":"9dfba5c9c08b8127e29669df841cfb70228b4a7a8415cb43c4a31498c4a432ec"} Mar 20 16:16:04 crc kubenswrapper[4675]: I0320 16:16:04.801066 4675 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:04 crc kubenswrapper[4675]: I0320 16:16:04.879246 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jphsv\" (UniqueName: \"kubernetes.io/projected/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6-kube-api-access-jphsv\") pod \"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6\" (UID: \"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6\") " Mar 20 16:16:04 crc kubenswrapper[4675]: I0320 16:16:04.888033 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6-kube-api-access-jphsv" (OuterVolumeSpecName: "kube-api-access-jphsv") pod "3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6" (UID: "3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6"). InnerVolumeSpecName "kube-api-access-jphsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:16:04 crc kubenswrapper[4675]: I0320 16:16:04.981752 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jphsv\" (UniqueName: \"kubernetes.io/projected/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6-kube-api-access-jphsv\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:05 crc kubenswrapper[4675]: I0320 16:16:05.581517 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" event={"ID":"3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6","Type":"ContainerDied","Data":"b443a8950165dc2f3f2eee04c1a29d95ad13b76e6054797f7f4d31c1af86ab13"} Mar 20 16:16:05 crc kubenswrapper[4675]: I0320 16:16:05.581579 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b443a8950165dc2f3f2eee04c1a29d95ad13b76e6054797f7f4d31c1af86ab13" Mar 20 16:16:05 crc kubenswrapper[4675]: I0320 16:16:05.581604 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-7z8lw" Mar 20 16:16:05 crc kubenswrapper[4675]: I0320 16:16:05.871370 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-5pkzv"] Mar 20 16:16:05 crc kubenswrapper[4675]: I0320 16:16:05.878440 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-5pkzv"] Mar 20 16:16:06 crc kubenswrapper[4675]: I0320 16:16:06.682611 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a5c91f-fec6-410d-a21f-7cf516d89f78" path="/var/lib/kubelet/pods/c5a5c91f-fec6-410d-a21f-7cf516d89f78/volumes" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.026534 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4"] Mar 20 16:16:09 crc kubenswrapper[4675]: E0320 16:16:09.027002 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6" containerName="oc" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.027015 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6" containerName="oc" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.027143 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6" containerName="oc" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.028027 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.031529 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.048829 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpnj\" (UniqueName: \"kubernetes.io/projected/b1368284-4a62-4656-b794-96aa0ed4e775-kube-api-access-4tpnj\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.048871 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.049172 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.066004 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4"] Mar 20 16:16:09 crc kubenswrapper[4675]: 
I0320 16:16:09.150822 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.150918 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpnj\" (UniqueName: \"kubernetes.io/projected/b1368284-4a62-4656-b794-96aa0ed4e775-kube-api-access-4tpnj\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.150948 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.151344 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.151419 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.167464 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpnj\" (UniqueName: \"kubernetes.io/projected/b1368284-4a62-4656-b794-96aa0ed4e775-kube-api-access-4tpnj\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.356306 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:09 crc kubenswrapper[4675]: I0320 16:16:09.787187 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4"] Mar 20 16:16:09 crc kubenswrapper[4675]: W0320 16:16:09.798016 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1368284_4a62_4656_b794_96aa0ed4e775.slice/crio-8a6c3800bc355e5f14674f43545b212cee244ce187b9a0b4b4cd6c213767b492 WatchSource:0}: Error finding container 8a6c3800bc355e5f14674f43545b212cee244ce187b9a0b4b4cd6c213767b492: Status 404 returned error can't find the container with id 8a6c3800bc355e5f14674f43545b212cee244ce187b9a0b4b4cd6c213767b492 Mar 20 16:16:10 crc kubenswrapper[4675]: I0320 16:16:10.615843 4675 generic.go:334] "Generic (PLEG): container finished" podID="b1368284-4a62-4656-b794-96aa0ed4e775" containerID="45ce9ad64b66a6207ff27393eb837d0ee6191caae03ff88be3af52d143cc0828" exitCode=0 
Mar 20 16:16:10 crc kubenswrapper[4675]: I0320 16:16:10.615917 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" event={"ID":"b1368284-4a62-4656-b794-96aa0ed4e775","Type":"ContainerDied","Data":"45ce9ad64b66a6207ff27393eb837d0ee6191caae03ff88be3af52d143cc0828"} Mar 20 16:16:10 crc kubenswrapper[4675]: I0320 16:16:10.616234 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" event={"ID":"b1368284-4a62-4656-b794-96aa0ed4e775","Type":"ContainerStarted","Data":"8a6c3800bc355e5f14674f43545b212cee244ce187b9a0b4b4cd6c213767b492"} Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.361187 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jv4b7"] Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.362194 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.373569 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jv4b7"] Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.377713 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fgl\" (UniqueName: \"kubernetes.io/projected/9c763cd9-4084-492d-b1e3-304714bec0d0-kube-api-access-79fgl\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.377761 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-catalog-content\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.377825 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-utilities\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.479078 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-catalog-content\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.479144 4675 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-utilities\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.479195 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fgl\" (UniqueName: \"kubernetes.io/projected/9c763cd9-4084-492d-b1e3-304714bec0d0-kube-api-access-79fgl\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.479921 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-catalog-content\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.480204 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-utilities\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.504177 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fgl\" (UniqueName: \"kubernetes.io/projected/9c763cd9-4084-492d-b1e3-304714bec0d0-kube-api-access-79fgl\") pod \"redhat-operators-jv4b7\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") " pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.682864 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:11 crc kubenswrapper[4675]: I0320 16:16:11.889540 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jv4b7"] Mar 20 16:16:12 crc kubenswrapper[4675]: I0320 16:16:12.629249 4675 generic.go:334] "Generic (PLEG): container finished" podID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerID="a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005" exitCode=0 Mar 20 16:16:12 crc kubenswrapper[4675]: I0320 16:16:12.629495 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerDied","Data":"a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005"} Mar 20 16:16:12 crc kubenswrapper[4675]: I0320 16:16:12.629526 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerStarted","Data":"7c9617fe14ffbc76b39e98ad8f10e987943d29699c7e77cd5b2b8b4a0b50f5d8"} Mar 20 16:16:13 crc kubenswrapper[4675]: I0320 16:16:13.636885 4675 generic.go:334] "Generic (PLEG): container finished" podID="b1368284-4a62-4656-b794-96aa0ed4e775" containerID="d93074e92ae3573bfbc11b291cec35c8b6a9dbace27ad1ce4d76163255fc768a" exitCode=0 Mar 20 16:16:13 crc kubenswrapper[4675]: I0320 16:16:13.636943 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" event={"ID":"b1368284-4a62-4656-b794-96aa0ed4e775","Type":"ContainerDied","Data":"d93074e92ae3573bfbc11b291cec35c8b6a9dbace27ad1ce4d76163255fc768a"} Mar 20 16:16:14 crc kubenswrapper[4675]: I0320 16:16:14.648091 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" 
event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerStarted","Data":"f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362"} Mar 20 16:16:14 crc kubenswrapper[4675]: I0320 16:16:14.652343 4675 generic.go:334] "Generic (PLEG): container finished" podID="b1368284-4a62-4656-b794-96aa0ed4e775" containerID="35c69861d19b15df0840f6d3f18df30e3967c3d72881fde7598b2382784629fa" exitCode=0 Mar 20 16:16:14 crc kubenswrapper[4675]: I0320 16:16:14.652418 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" event={"ID":"b1368284-4a62-4656-b794-96aa0ed4e775","Type":"ContainerDied","Data":"35c69861d19b15df0840f6d3f18df30e3967c3d72881fde7598b2382784629fa"} Mar 20 16:16:15 crc kubenswrapper[4675]: I0320 16:16:15.663910 4675 generic.go:334] "Generic (PLEG): container finished" podID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerID="f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362" exitCode=0 Mar 20 16:16:15 crc kubenswrapper[4675]: I0320 16:16:15.664025 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerDied","Data":"f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362"} Mar 20 16:16:15 crc kubenswrapper[4675]: I0320 16:16:15.953264 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.126540 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tpnj\" (UniqueName: \"kubernetes.io/projected/b1368284-4a62-4656-b794-96aa0ed4e775-kube-api-access-4tpnj\") pod \"b1368284-4a62-4656-b794-96aa0ed4e775\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.126724 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-util\") pod \"b1368284-4a62-4656-b794-96aa0ed4e775\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.126856 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-bundle\") pod \"b1368284-4a62-4656-b794-96aa0ed4e775\" (UID: \"b1368284-4a62-4656-b794-96aa0ed4e775\") " Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.127954 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-bundle" (OuterVolumeSpecName: "bundle") pod "b1368284-4a62-4656-b794-96aa0ed4e775" (UID: "b1368284-4a62-4656-b794-96aa0ed4e775"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.134055 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1368284-4a62-4656-b794-96aa0ed4e775-kube-api-access-4tpnj" (OuterVolumeSpecName: "kube-api-access-4tpnj") pod "b1368284-4a62-4656-b794-96aa0ed4e775" (UID: "b1368284-4a62-4656-b794-96aa0ed4e775"). InnerVolumeSpecName "kube-api-access-4tpnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.140966 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-util" (OuterVolumeSpecName: "util") pod "b1368284-4a62-4656-b794-96aa0ed4e775" (UID: "b1368284-4a62-4656-b794-96aa0ed4e775"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.228186 4675 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-util\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.228239 4675 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1368284-4a62-4656-b794-96aa0ed4e775-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.228259 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tpnj\" (UniqueName: \"kubernetes.io/projected/b1368284-4a62-4656-b794-96aa0ed4e775-kube-api-access-4tpnj\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.677270 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.688115 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" event={"ID":"b1368284-4a62-4656-b794-96aa0ed4e775","Type":"ContainerDied","Data":"8a6c3800bc355e5f14674f43545b212cee244ce187b9a0b4b4cd6c213767b492"} Mar 20 16:16:16 crc kubenswrapper[4675]: I0320 16:16:16.688172 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a6c3800bc355e5f14674f43545b212cee244ce187b9a0b4b4cd6c213767b492" Mar 20 16:16:17 crc kubenswrapper[4675]: I0320 16:16:17.687671 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerStarted","Data":"38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f"} Mar 20 16:16:17 crc kubenswrapper[4675]: I0320 16:16:17.713363 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jv4b7" podStartSLOduration=2.864131504 podStartE2EDuration="6.7133405s" podCreationTimestamp="2026-03-20 16:16:11 +0000 UTC" firstStartedPulling="2026-03-20 16:16:12.801465863 +0000 UTC m=+892.835095390" lastFinishedPulling="2026-03-20 16:16:16.650674839 +0000 UTC m=+896.684304386" observedRunningTime="2026-03-20 16:16:17.711145877 +0000 UTC m=+897.744775464" watchObservedRunningTime="2026-03-20 16:16:17.7133405 +0000 UTC m=+897.746970047" Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.674901 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"] Mar 20 16:16:19 crc kubenswrapper[4675]: E0320 16:16:19.675167 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" 
containerName="extract" Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.675184 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" containerName="extract" Mar 20 16:16:19 crc kubenswrapper[4675]: E0320 16:16:19.675196 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" containerName="util" Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.675203 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" containerName="util" Mar 20 16:16:19 crc kubenswrapper[4675]: E0320 16:16:19.675214 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" containerName="pull" Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.675221 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" containerName="pull" Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.675344 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" containerName="extract" Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.675845 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.680581 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.680695 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.680896 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-8dqh2"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.702187 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"]
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.777907 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkpmv\" (UniqueName: \"kubernetes.io/projected/193d5d89-d3ca-4090-abdb-284ad7cc91f9-kube-api-access-xkpmv\") pod \"nmstate-operator-796d4cfff4-4xn7f\" (UID: \"193d5d89-d3ca-4090-abdb-284ad7cc91f9\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.879539 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkpmv\" (UniqueName: \"kubernetes.io/projected/193d5d89-d3ca-4090-abdb-284ad7cc91f9-kube-api-access-xkpmv\") pod \"nmstate-operator-796d4cfff4-4xn7f\" (UID: \"193d5d89-d3ca-4090-abdb-284ad7cc91f9\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.901907 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkpmv\" (UniqueName: \"kubernetes.io/projected/193d5d89-d3ca-4090-abdb-284ad7cc91f9-kube-api-access-xkpmv\") pod \"nmstate-operator-796d4cfff4-4xn7f\" (UID: \"193d5d89-d3ca-4090-abdb-284ad7cc91f9\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"
Mar 20 16:16:19 crc kubenswrapper[4675]: I0320 16:16:19.992337 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"
Mar 20 16:16:20 crc kubenswrapper[4675]: I0320 16:16:20.399791 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f"]
Mar 20 16:16:20 crc kubenswrapper[4675]: I0320 16:16:20.725253 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f" event={"ID":"193d5d89-d3ca-4090-abdb-284ad7cc91f9","Type":"ContainerStarted","Data":"23284fac537d24a1cd593e6339a60aea89d753a46c63a48bd3757af8f9fcff56"}
Mar 20 16:16:21 crc kubenswrapper[4675]: I0320 16:16:21.683218 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jv4b7"
Mar 20 16:16:21 crc kubenswrapper[4675]: I0320 16:16:21.683563 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jv4b7"
Mar 20 16:16:22 crc kubenswrapper[4675]: I0320 16:16:22.742239 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jv4b7" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:16:22 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s
Mar 20 16:16:22 crc kubenswrapper[4675]: >
Mar 20 16:16:24 crc kubenswrapper[4675]: I0320 16:16:24.750900 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f" event={"ID":"193d5d89-d3ca-4090-abdb-284ad7cc91f9","Type":"ContainerStarted","Data":"4ccaf82cdcd65f543a5876c6e44cfda3c732af4d309c50fa6af9d882916fdaba"}
Mar 20 16:16:24 crc kubenswrapper[4675]: I0320 16:16:24.776381 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4xn7f" podStartSLOduration=2.497271406 podStartE2EDuration="5.776362641s" podCreationTimestamp="2026-03-20 16:16:19 +0000 UTC" firstStartedPulling="2026-03-20 16:16:20.412615239 +0000 UTC m=+900.446244816" lastFinishedPulling="2026-03-20 16:16:23.691706514 +0000 UTC m=+903.725336051" observedRunningTime="2026-03-20 16:16:24.776077743 +0000 UTC m=+904.809707290" watchObservedRunningTime="2026-03-20 16:16:24.776362641 +0000 UTC m=+904.809992178"
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.924115 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"]
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.925385 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.931128 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wb5nj"
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.934957 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"]
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.942629 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"]
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.943476 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.947879 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.960923 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"]
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.989800 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-422rl"]
Mar 20 16:16:28 crc kubenswrapper[4675]: I0320 16:16:28.990700 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.014562 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrnqp\" (UniqueName: \"kubernetes.io/projected/e4f94194-a598-4ec7-aac0-ba8e1c3e3e34-kube-api-access-hrnqp\") pod \"nmstate-metrics-9b8c8685d-t7r64\" (UID: \"e4f94194-a598-4ec7-aac0-ba8e1c3e3e34\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.093257 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"]
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.093888 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.095343 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.095551 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kwd7k"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.095675 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.115642 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-ovs-socket\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.115703 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-nmstate-lock\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.115895 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq6wn\" (UniqueName: \"kubernetes.io/projected/13927470-4093-4096-8a10-e1bdc0443571-kube-api-access-gq6wn\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.115961 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrnqp\" (UniqueName: \"kubernetes.io/projected/e4f94194-a598-4ec7-aac0-ba8e1c3e3e34-kube-api-access-hrnqp\") pod \"nmstate-metrics-9b8c8685d-t7r64\" (UID: \"e4f94194-a598-4ec7-aac0-ba8e1c3e3e34\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.115995 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-dbus-socket\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.116029 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4mv\" (UniqueName: \"kubernetes.io/projected/84187302-9f43-4f51-882d-d1cbdb1e22a3-kube-api-access-sm4mv\") pod \"nmstate-webhook-5f558f5558-hc7pt\" (UID: \"84187302-9f43-4f51-882d-d1cbdb1e22a3\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.116047 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/84187302-9f43-4f51-882d-d1cbdb1e22a3-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-hc7pt\" (UID: \"84187302-9f43-4f51-882d-d1cbdb1e22a3\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.128567 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"]
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.140317 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrnqp\" (UniqueName: \"kubernetes.io/projected/e4f94194-a598-4ec7-aac0-ba8e1c3e3e34-kube-api-access-hrnqp\") pod \"nmstate-metrics-9b8c8685d-t7r64\" (UID: \"e4f94194-a598-4ec7-aac0-ba8e1c3e3e34\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.217617 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218017 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq6wn\" (UniqueName: \"kubernetes.io/projected/13927470-4093-4096-8a10-e1bdc0443571-kube-api-access-gq6wn\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218065 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-dbus-socket\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218105 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4mv\" (UniqueName: \"kubernetes.io/projected/84187302-9f43-4f51-882d-d1cbdb1e22a3-kube-api-access-sm4mv\") pod \"nmstate-webhook-5f558f5558-hc7pt\" (UID: \"84187302-9f43-4f51-882d-d1cbdb1e22a3\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218140 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/84187302-9f43-4f51-882d-d1cbdb1e22a3-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-hc7pt\" (UID: \"84187302-9f43-4f51-882d-d1cbdb1e22a3\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218176 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218227 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dj9\" (UniqueName: \"kubernetes.io/projected/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-kube-api-access-72dj9\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218261 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-ovs-socket\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218297 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-nmstate-lock\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218394 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-ovs-socket\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218430 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-dbus-socket\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.218443 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/13927470-4093-4096-8a10-e1bdc0443571-nmstate-lock\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.230389 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/84187302-9f43-4f51-882d-d1cbdb1e22a3-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-hc7pt\" (UID: \"84187302-9f43-4f51-882d-d1cbdb1e22a3\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.234254 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq6wn\" (UniqueName: \"kubernetes.io/projected/13927470-4093-4096-8a10-e1bdc0443571-kube-api-access-gq6wn\") pod \"nmstate-handler-422rl\" (UID: \"13927470-4093-4096-8a10-e1bdc0443571\") " pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.237578 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4mv\" (UniqueName: \"kubernetes.io/projected/84187302-9f43-4f51-882d-d1cbdb1e22a3-kube-api-access-sm4mv\") pod \"nmstate-webhook-5f558f5558-hc7pt\" (UID: \"84187302-9f43-4f51-882d-d1cbdb1e22a3\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.243211 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.257798 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.278445 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f4db9b8cf-bh9bs"]
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.279308 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.287466 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f4db9b8cf-bh9bs"]
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.306291 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.319876 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.319942 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.319981 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72dj9\" (UniqueName: \"kubernetes.io/projected/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-kube-api-access-72dj9\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.321144 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.323982 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.338563 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72dj9\" (UniqueName: \"kubernetes.io/projected/48914fc8-8ca5-43e7-9048-11d34e7d4ed4-kube-api-access-72dj9\") pod \"nmstate-console-plugin-86f58fcf4-hkvrt\" (UID: \"48914fc8-8ca5-43e7-9048-11d34e7d4ed4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.409579 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422343 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-service-ca\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422396 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-trusted-ca-bundle\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422419 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-serving-cert\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422601 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-config\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422635 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-oauth-config\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422660 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-oauth-serving-cert\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.422702 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlwf\" (UniqueName: \"kubernetes.io/projected/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-kube-api-access-tvlwf\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.487667 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64"]
Mar 20 16:16:29 crc kubenswrapper[4675]: W0320 16:16:29.494512 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f94194_a598_4ec7_aac0_ba8e1c3e3e34.slice/crio-c80b8cbbcc4753bf0088fe1086f703cf97010c8994ace1980e8eb116d64a9a6a WatchSource:0}: Error finding container c80b8cbbcc4753bf0088fe1086f703cf97010c8994ace1980e8eb116d64a9a6a: Status 404 returned error can't find the container with id c80b8cbbcc4753bf0088fe1086f703cf97010c8994ace1980e8eb116d64a9a6a
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.523940 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-config\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.523978 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-oauth-config\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.524001 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-oauth-serving-cert\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.524026 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlwf\" (UniqueName: \"kubernetes.io/projected/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-kube-api-access-tvlwf\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.524059 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-service-ca\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.524080 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-trusted-ca-bundle\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.524095 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-serving-cert\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.525081 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-oauth-serving-cert\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.525622 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-config\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.525729 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-trusted-ca-bundle\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.526086 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-service-ca\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.527530 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"]
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.531733 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-serving-cert\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.539265 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-console-oauth-config\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.545100 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvlwf\" (UniqueName: \"kubernetes.io/projected/d0d58a3d-1c7c-4ce8-83a7-646d2875bb06-kube-api-access-tvlwf\") pod \"console-5f4db9b8cf-bh9bs\" (UID: \"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06\") " pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.631063 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f4db9b8cf-bh9bs"
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.637320 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt"]
Mar 20 16:16:29 crc kubenswrapper[4675]: W0320 16:16:29.645843 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48914fc8_8ca5_43e7_9048_11d34e7d4ed4.slice/crio-0515025665a2cbd217845faaac61622a6cb686f6a30e09d3dc8ec126a59f3de3 WatchSource:0}: Error finding container 0515025665a2cbd217845faaac61622a6cb686f6a30e09d3dc8ec126a59f3de3: Status 404 returned error can't find the container with id 0515025665a2cbd217845faaac61622a6cb686f6a30e09d3dc8ec126a59f3de3
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.792665 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64" event={"ID":"e4f94194-a598-4ec7-aac0-ba8e1c3e3e34","Type":"ContainerStarted","Data":"c80b8cbbcc4753bf0088fe1086f703cf97010c8994ace1980e8eb116d64a9a6a"}
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.795432 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt" event={"ID":"84187302-9f43-4f51-882d-d1cbdb1e22a3","Type":"ContainerStarted","Data":"b71417b31ba5013f418df9fbe08c61673cada6b6c524f18738350d944feca0be"}
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.798601 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-422rl" event={"ID":"13927470-4093-4096-8a10-e1bdc0443571","Type":"ContainerStarted","Data":"cb46142ac113ee106e59e13e3fdae4f4f4f3b0ad74820265665bbb5ab549678c"}
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.799731 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt" event={"ID":"48914fc8-8ca5-43e7-9048-11d34e7d4ed4","Type":"ContainerStarted","Data":"0515025665a2cbd217845faaac61622a6cb686f6a30e09d3dc8ec126a59f3de3"}
Mar 20 16:16:29 crc kubenswrapper[4675]: W0320 16:16:29.827566 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d58a3d_1c7c_4ce8_83a7_646d2875bb06.slice/crio-a1de4e127088c63277cc3467c2a05c4fda36c34ab8f99704dc4850162c07988e WatchSource:0}: Error finding container a1de4e127088c63277cc3467c2a05c4fda36c34ab8f99704dc4850162c07988e: Status 404 returned error can't find the container with id a1de4e127088c63277cc3467c2a05c4fda36c34ab8f99704dc4850162c07988e
Mar 20 16:16:29 crc kubenswrapper[4675]: I0320 16:16:29.827891 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f4db9b8cf-bh9bs"]
Mar 20 16:16:30 crc kubenswrapper[4675]: I0320 16:16:30.806286 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f4db9b8cf-bh9bs" event={"ID":"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06","Type":"ContainerStarted","Data":"0dff3bb14304cfddf85e275d864647813d6c90a37b076086af4415b2dd16e8f4"}
Mar 20 16:16:30 crc kubenswrapper[4675]: I0320 16:16:30.806640 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f4db9b8cf-bh9bs" event={"ID":"d0d58a3d-1c7c-4ce8-83a7-646d2875bb06","Type":"ContainerStarted","Data":"a1de4e127088c63277cc3467c2a05c4fda36c34ab8f99704dc4850162c07988e"}
Mar 20 16:16:30 crc kubenswrapper[4675]: I0320 16:16:30.821729 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f4db9b8cf-bh9bs" podStartSLOduration=1.8217159710000002 podStartE2EDuration="1.821715971s" podCreationTimestamp="2026-03-20 16:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:16:30.821122915 +0000 UTC m=+910.854752452" watchObservedRunningTime="2026-03-20 16:16:30.821715971 +0000 UTC m=+910.855345508"
Mar 20 16:16:31 crc kubenswrapper[4675]: I0320 16:16:31.723888 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jv4b7"
Mar 20 16:16:31 crc kubenswrapper[4675]: I0320 16:16:31.768361 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jv4b7"
Mar 20 16:16:32 crc kubenswrapper[4675]: I0320 16:16:32.819216 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt" event={"ID":"48914fc8-8ca5-43e7-9048-11d34e7d4ed4","Type":"ContainerStarted","Data":"a24074cafd23e8a4458fa2da9d4cd8aa5209ab369505064c398905e804aefebb"}
Mar 20 16:16:32 crc kubenswrapper[4675]: I0320 16:16:32.821008 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64" event={"ID":"e4f94194-a598-4ec7-aac0-ba8e1c3e3e34","Type":"ContainerStarted","Data":"369574f5785240550dcdd50ba05b250fcdb1e963187b811a71359bc24e2c0a17"}
Mar 20 16:16:32 crc kubenswrapper[4675]: I0320 16:16:32.822622 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt" event={"ID":"84187302-9f43-4f51-882d-d1cbdb1e22a3","Type":"ContainerStarted","Data":"84ccf036e6d013fbf52dc3c7530aa281c85a99617bbe1aad1846be50bd4b21ef"}
Mar 20 16:16:32 crc kubenswrapper[4675]: I0320 16:16:32.823007 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt"
Mar 20 16:16:32 crc kubenswrapper[4675]: I0320 16:16:32.838550 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-hkvrt" podStartSLOduration=0.931273577 podStartE2EDuration="3.838534059s" podCreationTimestamp="2026-03-20 16:16:29 +0000 UTC" firstStartedPulling="2026-03-20 16:16:29.647939977 +0000 UTC m=+909.681569514" lastFinishedPulling="2026-03-20 16:16:32.555200469 +0000 UTC m=+912.588829996" observedRunningTime="2026-03-20 16:16:32.837303005 +0000 UTC m=+912.870932562" watchObservedRunningTime="2026-03-20 16:16:32.838534059 +0000 UTC m=+912.872163596"
Mar 20 16:16:32 crc kubenswrapper[4675]: I0320 16:16:32.864126 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt" podStartSLOduration=1.8503277790000001 podStartE2EDuration="4.864107223s" podCreationTimestamp="2026-03-20 16:16:28 +0000 UTC" firstStartedPulling="2026-03-20 16:16:29.541490017 +0000 UTC m=+909.575119554" lastFinishedPulling="2026-03-20 16:16:32.555269461 +0000 UTC m=+912.588898998" observedRunningTime="2026-03-20 16:16:32.862571991 +0000 UTC m=+912.896201548" watchObservedRunningTime="2026-03-20 16:16:32.864107223 +0000 UTC m=+912.897736780"
Mar 20 16:16:33 crc kubenswrapper[4675]: I0320 16:16:33.832802 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-422rl" event={"ID":"13927470-4093-4096-8a10-e1bdc0443571","Type":"ContainerStarted","Data":"cab8bdcbb8513a2b9fd216ebaa98bf4bb9047d2bd1e38206c29672dec0b8ab53"}
Mar 20 16:16:33 crc kubenswrapper[4675]: I0320 16:16:33.834113 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-422rl"
Mar 20 16:16:33 crc kubenswrapper[4675]: I0320 16:16:33.856174 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-422rl" podStartSLOduration=2.631841054 podStartE2EDuration="5.856156146s" podCreationTimestamp="2026-03-20 16:16:28 +0000 UTC" firstStartedPulling="2026-03-20 16:16:29.334374461 +0000 UTC m=+909.368003998" lastFinishedPulling="2026-03-20 16:16:32.558689553 +0000 UTC m=+912.592319090" observedRunningTime="2026-03-20 16:16:33.852906168 +0000 UTC m=+913.886535755" watchObservedRunningTime="2026-03-20 16:16:33.856156146 +0000 UTC m=+913.889785703"
Mar 20 16:16:33 crc kubenswrapper[4675]: I0320 16:16:33.953510 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jv4b7"]
Mar 20 16:16:33 crc kubenswrapper[4675]: I0320 16:16:33.953709 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jv4b7" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="registry-server" containerID="cri-o://38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f" gracePeriod=2
Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.307849 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jv4b7"
Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.396878 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fgl\" (UniqueName: \"kubernetes.io/projected/9c763cd9-4084-492d-b1e3-304714bec0d0-kube-api-access-79fgl\") pod \"9c763cd9-4084-492d-b1e3-304714bec0d0\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") "
Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.396934 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-catalog-content\") pod \"9c763cd9-4084-492d-b1e3-304714bec0d0\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") "
Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.396996 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-utilities\") pod \"9c763cd9-4084-492d-b1e3-304714bec0d0\" (UID: \"9c763cd9-4084-492d-b1e3-304714bec0d0\") "
Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.400444 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-utilities" (OuterVolumeSpecName: "utilities") pod "9c763cd9-4084-492d-b1e3-304714bec0d0" (UID: "9c763cd9-4084-492d-b1e3-304714bec0d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.402492 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c763cd9-4084-492d-b1e3-304714bec0d0-kube-api-access-79fgl" (OuterVolumeSpecName: "kube-api-access-79fgl") pod "9c763cd9-4084-492d-b1e3-304714bec0d0" (UID: "9c763cd9-4084-492d-b1e3-304714bec0d0"). InnerVolumeSpecName "kube-api-access-79fgl".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.498757 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.499108 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fgl\" (UniqueName: \"kubernetes.io/projected/9c763cd9-4084-492d-b1e3-304714bec0d0-kube-api-access-79fgl\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.539714 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c763cd9-4084-492d-b1e3-304714bec0d0" (UID: "9c763cd9-4084-492d-b1e3-304714bec0d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.600569 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c763cd9-4084-492d-b1e3-304714bec0d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.842628 4675 generic.go:334] "Generic (PLEG): container finished" podID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerID="38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f" exitCode=0 Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.842819 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jv4b7" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.842822 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerDied","Data":"38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f"} Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.842924 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jv4b7" event={"ID":"9c763cd9-4084-492d-b1e3-304714bec0d0","Type":"ContainerDied","Data":"7c9617fe14ffbc76b39e98ad8f10e987943d29699c7e77cd5b2b8b4a0b50f5d8"} Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.842967 4675 scope.go:117] "RemoveContainer" containerID="38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f" Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.866798 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jv4b7"] Mar 20 16:16:34 crc kubenswrapper[4675]: I0320 16:16:34.870325 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jv4b7"] Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.190647 4675 scope.go:117] "RemoveContainer" containerID="f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.233751 4675 scope.go:117] "RemoveContainer" containerID="a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.256892 4675 scope.go:117] "RemoveContainer" containerID="38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f" Mar 20 16:16:35 crc kubenswrapper[4675]: E0320 16:16:35.257572 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f\": container with ID starting with 38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f not found: ID does not exist" containerID="38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.257612 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f"} err="failed to get container status \"38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f\": rpc error: code = NotFound desc = could not find container \"38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f\": container with ID starting with 38401354390ffcb00a8586a754964e2c677464179ccaa061190da26cf5f2310f not found: ID does not exist" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.257640 4675 scope.go:117] "RemoveContainer" containerID="f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362" Mar 20 16:16:35 crc kubenswrapper[4675]: E0320 16:16:35.258032 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362\": container with ID starting with f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362 not found: ID does not exist" containerID="f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.258067 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362"} err="failed to get container status \"f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362\": rpc error: code = NotFound desc = could not find container \"f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362\": container with ID 
starting with f5df0384e0ba87f1e2478d217c5843f7669f5602e244ab0890dbb6c24a84a362 not found: ID does not exist" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.258093 4675 scope.go:117] "RemoveContainer" containerID="a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005" Mar 20 16:16:35 crc kubenswrapper[4675]: E0320 16:16:35.258872 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005\": container with ID starting with a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005 not found: ID does not exist" containerID="a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.258896 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005"} err="failed to get container status \"a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005\": rpc error: code = NotFound desc = could not find container \"a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005\": container with ID starting with a6cfd0676875172eac4344d46dee1278ec4413086e6a2ad54395670203550005 not found: ID does not exist" Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.854691 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64" event={"ID":"e4f94194-a598-4ec7-aac0-ba8e1c3e3e34","Type":"ContainerStarted","Data":"f148bec3c2690f273febbac4b883c910814e9597e5de6ac933abb07a346152df"} Mar 20 16:16:35 crc kubenswrapper[4675]: I0320 16:16:35.883831 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-t7r64" podStartSLOduration=2.139494248 podStartE2EDuration="7.883729106s" podCreationTimestamp="2026-03-20 16:16:28 +0000 UTC" 
firstStartedPulling="2026-03-20 16:16:29.496808945 +0000 UTC m=+909.530438482" lastFinishedPulling="2026-03-20 16:16:35.241043803 +0000 UTC m=+915.274673340" observedRunningTime="2026-03-20 16:16:35.871138094 +0000 UTC m=+915.904767691" watchObservedRunningTime="2026-03-20 16:16:35.883729106 +0000 UTC m=+915.917358673" Mar 20 16:16:36 crc kubenswrapper[4675]: I0320 16:16:36.431346 4675 scope.go:117] "RemoveContainer" containerID="cbdae13faf90ce897531798cd7426f8fae027a7002aea03bc349720cc4233a99" Mar 20 16:16:36 crc kubenswrapper[4675]: I0320 16:16:36.707227 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" path="/var/lib/kubelet/pods/9c763cd9-4084-492d-b1e3-304714bec0d0/volumes" Mar 20 16:16:39 crc kubenswrapper[4675]: I0320 16:16:39.348979 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-422rl" Mar 20 16:16:39 crc kubenswrapper[4675]: I0320 16:16:39.632447 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f4db9b8cf-bh9bs" Mar 20 16:16:39 crc kubenswrapper[4675]: I0320 16:16:39.633133 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5f4db9b8cf-bh9bs" Mar 20 16:16:39 crc kubenswrapper[4675]: I0320 16:16:39.639096 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f4db9b8cf-bh9bs" Mar 20 16:16:39 crc kubenswrapper[4675]: I0320 16:16:39.891478 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f4db9b8cf-bh9bs" Mar 20 16:16:39 crc kubenswrapper[4675]: I0320 16:16:39.976550 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cnrtx"] Mar 20 16:16:46 crc kubenswrapper[4675]: I0320 16:16:46.708580 4675 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","burstable","podb1368284-4a62-4656-b794-96aa0ed4e775"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb1368284-4a62-4656-b794-96aa0ed4e775] : Timed out while waiting for systemd to remove kubepods-burstable-podb1368284_4a62_4656_b794_96aa0ed4e775.slice" Mar 20 16:16:46 crc kubenswrapper[4675]: E0320 16:16:46.709390 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable podb1368284-4a62-4656-b794-96aa0ed4e775] : unable to destroy cgroup paths for cgroup [kubepods burstable podb1368284-4a62-4656-b794-96aa0ed4e775] : Timed out while waiting for systemd to remove kubepods-burstable-podb1368284_4a62_4656_b794_96aa0ed4e775.slice" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" podUID="b1368284-4a62-4656-b794-96aa0ed4e775" Mar 20 16:16:46 crc kubenswrapper[4675]: I0320 16:16:46.933064 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4" Mar 20 16:16:49 crc kubenswrapper[4675]: I0320 16:16:49.267281 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hc7pt" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.723832 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz"] Mar 20 16:17:01 crc kubenswrapper[4675]: E0320 16:17:01.724516 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="extract-content" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.724530 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="extract-content" Mar 20 16:17:01 crc kubenswrapper[4675]: E0320 16:17:01.724551 4675 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="extract-utilities" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.724559 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="extract-utilities" Mar 20 16:17:01 crc kubenswrapper[4675]: E0320 16:17:01.724573 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="registry-server" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.724581 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="registry-server" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.724705 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c763cd9-4084-492d-b1e3-304714bec0d0" containerName="registry-server" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.725659 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.727716 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz"] Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.728221 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.897748 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9dt\" (UniqueName: \"kubernetes.io/projected/243915a2-beb9-4d55-914c-6c27c64ee50a-kube-api-access-zr9dt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.898226 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.898321 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:01 crc kubenswrapper[4675]: 
I0320 16:17:01.999222 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9dt\" (UniqueName: \"kubernetes.io/projected/243915a2-beb9-4d55-914c-6c27c64ee50a-kube-api-access-zr9dt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.999281 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:01 crc kubenswrapper[4675]: I0320 16:17:01.999335 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:02 crc kubenswrapper[4675]: I0320 16:17:01.999944 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:02 crc kubenswrapper[4675]: I0320 16:17:02.000317 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:02 crc kubenswrapper[4675]: I0320 16:17:02.030751 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9dt\" (UniqueName: \"kubernetes.io/projected/243915a2-beb9-4d55-914c-6c27c64ee50a-kube-api-access-zr9dt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:02 crc kubenswrapper[4675]: I0320 16:17:02.048407 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" Mar 20 16:17:02 crc kubenswrapper[4675]: I0320 16:17:02.462996 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz"] Mar 20 16:17:02 crc kubenswrapper[4675]: W0320 16:17:02.473409 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243915a2_beb9_4d55_914c_6c27c64ee50a.slice/crio-c67d102a07c6afee89da1b67034eafc2b6084ff7c79f77f1509f1c414b899e57 WatchSource:0}: Error finding container c67d102a07c6afee89da1b67034eafc2b6084ff7c79f77f1509f1c414b899e57: Status 404 returned error can't find the container with id c67d102a07c6afee89da1b67034eafc2b6084ff7c79f77f1509f1c414b899e57 Mar 20 16:17:03 crc kubenswrapper[4675]: I0320 16:17:03.051431 4675 generic.go:334] "Generic (PLEG): container finished" podID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerID="6c4103a1b0c8a38521a360d061a0d82662d3b8358824b168afba331def46493d" exitCode=0 
Mar 20 16:17:03 crc kubenswrapper[4675]: I0320 16:17:03.051472 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" event={"ID":"243915a2-beb9-4d55-914c-6c27c64ee50a","Type":"ContainerDied","Data":"6c4103a1b0c8a38521a360d061a0d82662d3b8358824b168afba331def46493d"} Mar 20 16:17:03 crc kubenswrapper[4675]: I0320 16:17:03.051497 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" event={"ID":"243915a2-beb9-4d55-914c-6c27c64ee50a","Type":"ContainerStarted","Data":"c67d102a07c6afee89da1b67034eafc2b6084ff7c79f77f1509f1c414b899e57"} Mar 20 16:17:04 crc kubenswrapper[4675]: I0320 16:17:04.425397 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:17:04 crc kubenswrapper[4675]: I0320 16:17:04.425813 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.064204 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-cnrtx" podUID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" containerName="console" containerID="cri-o://d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1" gracePeriod=15 Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.419941 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-cnrtx_ccabe656-71a5-4e5b-b5f8-093e1b38f62c/console/0.log" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.420235 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cnrtx" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.453677 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-serving-cert\") pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.453740 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-trusted-ca-bundle\") pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.453823 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-oauth-config\") pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.453871 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-service-ca\") pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.453931 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-config\") 
pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.453975 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2hqx\" (UniqueName: \"kubernetes.io/projected/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-kube-api-access-b2hqx\") pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.454004 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-oauth-serving-cert\") pod \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\" (UID: \"ccabe656-71a5-4e5b-b5f8-093e1b38f62c\") " Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.454446 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.454621 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.454878 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-service-ca" (OuterVolumeSpecName: "service-ca") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.455134 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-config" (OuterVolumeSpecName: "console-config") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.459981 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.460809 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.466065 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-kube-api-access-b2hqx" (OuterVolumeSpecName: "kube-api-access-b2hqx") pod "ccabe656-71a5-4e5b-b5f8-093e1b38f62c" (UID: "ccabe656-71a5-4e5b-b5f8-093e1b38f62c"). InnerVolumeSpecName "kube-api-access-b2hqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.483685 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f4tk4"]
Mar 20 16:17:05 crc kubenswrapper[4675]: E0320 16:17:05.484021 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" containerName="console"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.484047 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" containerName="console"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.484177 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" containerName="console"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.485119 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.493048 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4tk4"]
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.555720 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-utilities\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.555860 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-catalog-content\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.555900 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzg9d\" (UniqueName: \"kubernetes.io/projected/7c112b84-0647-4e64-a97f-8cd88631372d-kube-api-access-pzg9d\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556290 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556307 4675 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556340 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2hqx\" (UniqueName: \"kubernetes.io/projected/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-kube-api-access-b2hqx\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556350 4675 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556361 4675 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556369 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.556377 4675 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccabe656-71a5-4e5b-b5f8-093e1b38f62c-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.657890 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-utilities\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.657938 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-catalog-content\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.657964 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzg9d\" (UniqueName: \"kubernetes.io/projected/7c112b84-0647-4e64-a97f-8cd88631372d-kube-api-access-pzg9d\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.658590 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-utilities\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.658636 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-catalog-content\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.676158 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzg9d\" (UniqueName: \"kubernetes.io/projected/7c112b84-0647-4e64-a97f-8cd88631372d-kube-api-access-pzg9d\") pod \"redhat-marketplace-f4tk4\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") " pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:05 crc kubenswrapper[4675]: I0320 16:17:05.829901 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.023568 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4tk4"]
Mar 20 16:17:06 crc kubenswrapper[4675]: W0320 16:17:06.031923 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c112b84_0647_4e64_a97f_8cd88631372d.slice/crio-fb85c842fcd3d44c6e7319ec13190da94571006e938dce7eb00be9f04d10bd56 WatchSource:0}: Error finding container fb85c842fcd3d44c6e7319ec13190da94571006e938dce7eb00be9f04d10bd56: Status 404 returned error can't find the container with id fb85c842fcd3d44c6e7319ec13190da94571006e938dce7eb00be9f04d10bd56
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.069109 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4tk4" event={"ID":"7c112b84-0647-4e64-a97f-8cd88631372d","Type":"ContainerStarted","Data":"fb85c842fcd3d44c6e7319ec13190da94571006e938dce7eb00be9f04d10bd56"}
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.070592 4675 generic.go:334] "Generic (PLEG): container finished" podID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerID="836562bd42739599a99380b6b106cd832b3914d377ccce13b65b070bc291b61a" exitCode=0
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.070628 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" event={"ID":"243915a2-beb9-4d55-914c-6c27c64ee50a","Type":"ContainerDied","Data":"836562bd42739599a99380b6b106cd832b3914d377ccce13b65b070bc291b61a"}
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.074402 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cnrtx_ccabe656-71a5-4e5b-b5f8-093e1b38f62c/console/0.log"
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.074440 4675 generic.go:334] "Generic (PLEG): container finished" podID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" containerID="d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1" exitCode=2
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.074460 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cnrtx" event={"ID":"ccabe656-71a5-4e5b-b5f8-093e1b38f62c","Type":"ContainerDied","Data":"d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1"}
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.074477 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cnrtx" event={"ID":"ccabe656-71a5-4e5b-b5f8-093e1b38f62c","Type":"ContainerDied","Data":"a1f6de4fc96a8199548db60237b7f23998d843007478d92bdbd8d2fe84c94c4e"}
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.074493 4675 scope.go:117] "RemoveContainer" containerID="d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1"
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.074583 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cnrtx"
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.111496 4675 scope.go:117] "RemoveContainer" containerID="d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1"
Mar 20 16:17:06 crc kubenswrapper[4675]: E0320 16:17:06.111981 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1\": container with ID starting with d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1 not found: ID does not exist" containerID="d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1"
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.112021 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1"} err="failed to get container status \"d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1\": rpc error: code = NotFound desc = could not find container \"d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1\": container with ID starting with d337d32c09ee100e3bba7bc4a384b351784ceb46a68abd43e09de735cb2960d1 not found: ID does not exist"
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.121757 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cnrtx"]
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.127663 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-cnrtx"]
Mar 20 16:17:06 crc kubenswrapper[4675]: I0320 16:17:06.686450 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccabe656-71a5-4e5b-b5f8-093e1b38f62c" path="/var/lib/kubelet/pods/ccabe656-71a5-4e5b-b5f8-093e1b38f62c/volumes"
Mar 20 16:17:07 crc kubenswrapper[4675]: I0320 16:17:07.082793 4675 generic.go:334] "Generic (PLEG): container finished" podID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerID="06f133364880f1559fb0caa55d4402af01f908cdaf2f37f213664c3a71e81285" exitCode=0
Mar 20 16:17:07 crc kubenswrapper[4675]: I0320 16:17:07.082926 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" event={"ID":"243915a2-beb9-4d55-914c-6c27c64ee50a","Type":"ContainerDied","Data":"06f133364880f1559fb0caa55d4402af01f908cdaf2f37f213664c3a71e81285"}
Mar 20 16:17:07 crc kubenswrapper[4675]: I0320 16:17:07.087983 4675 generic.go:334] "Generic (PLEG): container finished" podID="7c112b84-0647-4e64-a97f-8cd88631372d" containerID="28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60" exitCode=0
Mar 20 16:17:07 crc kubenswrapper[4675]: I0320 16:17:07.088019 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4tk4" event={"ID":"7c112b84-0647-4e64-a97f-8cd88631372d","Type":"ContainerDied","Data":"28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60"}
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.098614 4675 generic.go:334] "Generic (PLEG): container finished" podID="7c112b84-0647-4e64-a97f-8cd88631372d" containerID="029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775" exitCode=0
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.098736 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4tk4" event={"ID":"7c112b84-0647-4e64-a97f-8cd88631372d","Type":"ContainerDied","Data":"029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775"}
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.342619 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz"
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.393648 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-util\") pod \"243915a2-beb9-4d55-914c-6c27c64ee50a\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") "
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.393729 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-bundle\") pod \"243915a2-beb9-4d55-914c-6c27c64ee50a\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") "
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.393840 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr9dt\" (UniqueName: \"kubernetes.io/projected/243915a2-beb9-4d55-914c-6c27c64ee50a-kube-api-access-zr9dt\") pod \"243915a2-beb9-4d55-914c-6c27c64ee50a\" (UID: \"243915a2-beb9-4d55-914c-6c27c64ee50a\") "
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.394942 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-bundle" (OuterVolumeSpecName: "bundle") pod "243915a2-beb9-4d55-914c-6c27c64ee50a" (UID: "243915a2-beb9-4d55-914c-6c27c64ee50a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.399131 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243915a2-beb9-4d55-914c-6c27c64ee50a-kube-api-access-zr9dt" (OuterVolumeSpecName: "kube-api-access-zr9dt") pod "243915a2-beb9-4d55-914c-6c27c64ee50a" (UID: "243915a2-beb9-4d55-914c-6c27c64ee50a"). InnerVolumeSpecName "kube-api-access-zr9dt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.404431 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-util" (OuterVolumeSpecName: "util") pod "243915a2-beb9-4d55-914c-6c27c64ee50a" (UID: "243915a2-beb9-4d55-914c-6c27c64ee50a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.495461 4675 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-util\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.495496 4675 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243915a2-beb9-4d55-914c-6c27c64ee50a-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:08 crc kubenswrapper[4675]: I0320 16:17:08.495508 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr9dt\" (UniqueName: \"kubernetes.io/projected/243915a2-beb9-4d55-914c-6c27c64ee50a-kube-api-access-zr9dt\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:09 crc kubenswrapper[4675]: I0320 16:17:09.106107 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4tk4" event={"ID":"7c112b84-0647-4e64-a97f-8cd88631372d","Type":"ContainerStarted","Data":"fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614"}
Mar 20 16:17:09 crc kubenswrapper[4675]: I0320 16:17:09.108209 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz" event={"ID":"243915a2-beb9-4d55-914c-6c27c64ee50a","Type":"ContainerDied","Data":"c67d102a07c6afee89da1b67034eafc2b6084ff7c79f77f1509f1c414b899e57"}
Mar 20 16:17:09 crc kubenswrapper[4675]: I0320 16:17:09.108248 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c67d102a07c6afee89da1b67034eafc2b6084ff7c79f77f1509f1c414b899e57"
Mar 20 16:17:09 crc kubenswrapper[4675]: I0320 16:17:09.108296 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz"
Mar 20 16:17:09 crc kubenswrapper[4675]: I0320 16:17:09.123600 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f4tk4" podStartSLOduration=2.482748389 podStartE2EDuration="4.123580871s" podCreationTimestamp="2026-03-20 16:17:05 +0000 UTC" firstStartedPulling="2026-03-20 16:17:07.090224276 +0000 UTC m=+947.123853813" lastFinishedPulling="2026-03-20 16:17:08.731056748 +0000 UTC m=+948.764686295" observedRunningTime="2026-03-20 16:17:09.120735754 +0000 UTC m=+949.154365301" watchObservedRunningTime="2026-03-20 16:17:09.123580871 +0000 UTC m=+949.157210418"
Mar 20 16:17:15 crc kubenswrapper[4675]: I0320 16:17:15.830364 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:15 crc kubenswrapper[4675]: I0320 16:17:15.830969 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:15 crc kubenswrapper[4675]: I0320 16:17:15.867395 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:16 crc kubenswrapper[4675]: I0320 16:17:16.190673 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.068044 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4tk4"]
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.160781 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f4tk4" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="registry-server" containerID="cri-o://fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614" gracePeriod=2
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.553119 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.713022 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-utilities\") pod \"7c112b84-0647-4e64-a97f-8cd88631372d\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") "
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.713075 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzg9d\" (UniqueName: \"kubernetes.io/projected/7c112b84-0647-4e64-a97f-8cd88631372d-kube-api-access-pzg9d\") pod \"7c112b84-0647-4e64-a97f-8cd88631372d\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") "
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.713110 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-catalog-content\") pod \"7c112b84-0647-4e64-a97f-8cd88631372d\" (UID: \"7c112b84-0647-4e64-a97f-8cd88631372d\") "
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.713787 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-utilities" (OuterVolumeSpecName: "utilities") pod "7c112b84-0647-4e64-a97f-8cd88631372d" (UID: "7c112b84-0647-4e64-a97f-8cd88631372d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.713896 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.720958 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c112b84-0647-4e64-a97f-8cd88631372d-kube-api-access-pzg9d" (OuterVolumeSpecName: "kube-api-access-pzg9d") pod "7c112b84-0647-4e64-a97f-8cd88631372d" (UID: "7c112b84-0647-4e64-a97f-8cd88631372d"). InnerVolumeSpecName "kube-api-access-pzg9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.737957 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c112b84-0647-4e64-a97f-8cd88631372d" (UID: "7c112b84-0647-4e64-a97f-8cd88631372d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.814880 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzg9d\" (UniqueName: \"kubernetes.io/projected/7c112b84-0647-4e64-a97f-8cd88631372d-kube-api-access-pzg9d\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:18 crc kubenswrapper[4675]: I0320 16:17:18.814924 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c112b84-0647-4e64-a97f-8cd88631372d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.169268 4675 generic.go:334] "Generic (PLEG): container finished" podID="7c112b84-0647-4e64-a97f-8cd88631372d" containerID="fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614" exitCode=0
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.169319 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4tk4" event={"ID":"7c112b84-0647-4e64-a97f-8cd88631372d","Type":"ContainerDied","Data":"fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614"}
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.169352 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4tk4" event={"ID":"7c112b84-0647-4e64-a97f-8cd88631372d","Type":"ContainerDied","Data":"fb85c842fcd3d44c6e7319ec13190da94571006e938dce7eb00be9f04d10bd56"}
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.169353 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4tk4"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.169371 4675 scope.go:117] "RemoveContainer" containerID="fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.196940 4675 scope.go:117] "RemoveContainer" containerID="029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.203540 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4tk4"]
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.215035 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4tk4"]
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.216872 4675 scope.go:117] "RemoveContainer" containerID="28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.234975 4675 scope.go:117] "RemoveContainer" containerID="fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.235481 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614\": container with ID starting with fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614 not found: ID does not exist" containerID="fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.235514 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614"} err="failed to get container status \"fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614\": rpc error: code = NotFound desc = could not find container \"fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614\": container with ID starting with fed0ca1c0163d2dfc1f13141bc2fd9ce184e99f9d63b59b4f22b950ffd599614 not found: ID does not exist"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.235538 4675 scope.go:117] "RemoveContainer" containerID="029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.235830 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775\": container with ID starting with 029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775 not found: ID does not exist" containerID="029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.235847 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775"} err="failed to get container status \"029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775\": rpc error: code = NotFound desc = could not find container \"029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775\": container with ID starting with 029b2f944927d60a4788d535c7e7fb172134981e2573dd819b9b505bd18ed775 not found: ID does not exist"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.235859 4675 scope.go:117] "RemoveContainer" containerID="28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.236292 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60\": container with ID starting with 28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60 not found: ID does not exist" containerID="28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.236310 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60"} err="failed to get container status \"28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60\": rpc error: code = NotFound desc = could not find container \"28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60\": container with ID starting with 28d1d11af2a480a24795ffbc145e115ef8a00190c7dde20dad863d5ecd864f60 not found: ID does not exist"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.533800 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"]
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.534397 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="extract"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.534560 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="extract"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.534639 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="extract-content"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.534710 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="extract-content"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.534801 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="registry-server"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.534874 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="registry-server"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.534967 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="extract-utilities"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.535049 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="extract-utilities"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.535137 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="pull"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.535280 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="pull"
Mar 20 16:17:19 crc kubenswrapper[4675]: E0320 16:17:19.535357 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="util"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.535429 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="util"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.535649 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" containerName="registry-server"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.535740 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="243915a2-beb9-4d55-914c-6c27c64ee50a" containerName="extract"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.536340 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.538529 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.539025 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rhg8s"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.539277 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.539647 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.539813 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.549176 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"]
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.624842 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-webhook-cert\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.624912 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmzk\" (UniqueName: \"kubernetes.io/projected/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-kube-api-access-kxmzk\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.624973 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-apiservice-cert\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.726224 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-webhook-cert\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.726278 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmzk\" (UniqueName: \"kubernetes.io/projected/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-kube-api-access-kxmzk\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.726352 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-apiservice-cert\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.732191 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-apiservice-cert\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.742753 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-webhook-cert\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.757396 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmzk\" (UniqueName: \"kubernetes.io/projected/3e995caa-02b4-47fa-9e1f-2f40a8234f0c-kube-api-access-kxmzk\") pod \"metallb-operator-controller-manager-68f54df857-mz9xg\" (UID: \"3e995caa-02b4-47fa-9e1f-2f40a8234f0c\") " pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.852058 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.867491 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm"]
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.868196 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.879575 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-h6k2k"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.880151 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.880350 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 20 16:17:19 crc kubenswrapper[4675]: I0320 16:17:19.882610 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm"]
Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.029788 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2069d30f-51d2-4294-a579-cb0843239946-apiservice-cert\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm"
Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.029824 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2069d30f-51d2-4294-a579-cb0843239946-webhook-cert\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm"
Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.029865 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplpp\" (UniqueName:
\"kubernetes.io/projected/2069d30f-51d2-4294-a579-cb0843239946-kube-api-access-tplpp\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.082656 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg"] Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.130610 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2069d30f-51d2-4294-a579-cb0843239946-apiservice-cert\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.130653 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2069d30f-51d2-4294-a579-cb0843239946-webhook-cert\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.130692 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplpp\" (UniqueName: \"kubernetes.io/projected/2069d30f-51d2-4294-a579-cb0843239946-kube-api-access-tplpp\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.134959 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2069d30f-51d2-4294-a579-cb0843239946-webhook-cert\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.136261 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2069d30f-51d2-4294-a579-cb0843239946-apiservice-cert\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.153608 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplpp\" (UniqueName: \"kubernetes.io/projected/2069d30f-51d2-4294-a579-cb0843239946-kube-api-access-tplpp\") pod \"metallb-operator-webhook-server-859fb75b66-mjggm\" (UID: \"2069d30f-51d2-4294-a579-cb0843239946\") " pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.176986 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg" event={"ID":"3e995caa-02b4-47fa-9e1f-2f40a8234f0c","Type":"ContainerStarted","Data":"67f2998fe3f1b24ddd1bcee892027bcce09f9db155aff9ef208b274c92c91cbc"} Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.206932 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.443949 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm"] Mar 20 16:17:20 crc kubenswrapper[4675]: W0320 16:17:20.452754 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2069d30f_51d2_4294_a579_cb0843239946.slice/crio-675a1eed78099ae275f474857d1a43e0d6b56ce9010dea4f5a391520d7a0a9a7 WatchSource:0}: Error finding container 675a1eed78099ae275f474857d1a43e0d6b56ce9010dea4f5a391520d7a0a9a7: Status 404 returned error can't find the container with id 675a1eed78099ae275f474857d1a43e0d6b56ce9010dea4f5a391520d7a0a9a7 Mar 20 16:17:20 crc kubenswrapper[4675]: I0320 16:17:20.683683 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c112b84-0647-4e64-a97f-8cd88631372d" path="/var/lib/kubelet/pods/7c112b84-0647-4e64-a97f-8cd88631372d/volumes" Mar 20 16:17:21 crc kubenswrapper[4675]: I0320 16:17:21.182697 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" event={"ID":"2069d30f-51d2-4294-a579-cb0843239946","Type":"ContainerStarted","Data":"675a1eed78099ae275f474857d1a43e0d6b56ce9010dea4f5a391520d7a0a9a7"} Mar 20 16:17:24 crc kubenswrapper[4675]: I0320 16:17:24.203569 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg" event={"ID":"3e995caa-02b4-47fa-9e1f-2f40a8234f0c","Type":"ContainerStarted","Data":"ba446750e9b0a4569122fdd0d10b2518e2b184f286e927433b69d535ff08b4f4"} Mar 20 16:17:24 crc kubenswrapper[4675]: I0320 16:17:24.204048 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg" Mar 20 16:17:24 crc 
kubenswrapper[4675]: I0320 16:17:24.228514 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg" podStartSLOduration=1.891813725 podStartE2EDuration="5.228489833s" podCreationTimestamp="2026-03-20 16:17:19 +0000 UTC" firstStartedPulling="2026-03-20 16:17:20.096305194 +0000 UTC m=+960.129934721" lastFinishedPulling="2026-03-20 16:17:23.432981292 +0000 UTC m=+963.466610829" observedRunningTime="2026-03-20 16:17:24.222541252 +0000 UTC m=+964.256170789" watchObservedRunningTime="2026-03-20 16:17:24.228489833 +0000 UTC m=+964.262119370" Mar 20 16:17:26 crc kubenswrapper[4675]: I0320 16:17:26.222949 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" event={"ID":"2069d30f-51d2-4294-a579-cb0843239946","Type":"ContainerStarted","Data":"92cacb65194ba3380566c93a2a5e9570c14a7d2c12e5c922d7b953f5416e5611"} Mar 20 16:17:26 crc kubenswrapper[4675]: I0320 16:17:26.223431 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:26 crc kubenswrapper[4675]: I0320 16:17:26.246345 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" podStartSLOduration=1.7579951029999998 podStartE2EDuration="7.246319928s" podCreationTimestamp="2026-03-20 16:17:19 +0000 UTC" firstStartedPulling="2026-03-20 16:17:20.455549274 +0000 UTC m=+960.489178811" lastFinishedPulling="2026-03-20 16:17:25.943874099 +0000 UTC m=+965.977503636" observedRunningTime="2026-03-20 16:17:26.245315261 +0000 UTC m=+966.278944788" watchObservedRunningTime="2026-03-20 16:17:26.246319928 +0000 UTC m=+966.279949455" Mar 20 16:17:34 crc kubenswrapper[4675]: I0320 16:17:34.424517 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:17:34 crc kubenswrapper[4675]: I0320 16:17:34.425150 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:17:40 crc kubenswrapper[4675]: I0320 16:17:40.212086 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-859fb75b66-mjggm" Mar 20 16:17:59 crc kubenswrapper[4675]: I0320 16:17:59.855974 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-68f54df857-mz9xg" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.140711 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567058-nfwrp"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.141482 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.144149 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.144953 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.145860 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.147576 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-nfwrp"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.239521 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpmwv\" (UniqueName: \"kubernetes.io/projected/b738426f-49bb-4a73-b55f-c4840a67a7d5-kube-api-access-tpmwv\") pod \"auto-csr-approver-29567058-nfwrp\" (UID: \"b738426f-49bb-4a73-b55f-c4840a67a7d5\") " pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.340715 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpmwv\" (UniqueName: \"kubernetes.io/projected/b738426f-49bb-4a73-b55f-c4840a67a7d5-kube-api-access-tpmwv\") pod \"auto-csr-approver-29567058-nfwrp\" (UID: \"b738426f-49bb-4a73-b55f-c4840a67a7d5\") " pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.358378 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpmwv\" (UniqueName: \"kubernetes.io/projected/b738426f-49bb-4a73-b55f-c4840a67a7d5-kube-api-access-tpmwv\") pod \"auto-csr-approver-29567058-nfwrp\" (UID: \"b738426f-49bb-4a73-b55f-c4840a67a7d5\") " 
pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.461242 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.692691 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-64lgp"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.695648 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.698157 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.699521 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.707996 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.708189 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.710480 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.710607 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-djr28" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.720068 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.735820 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-nfwrp"] Mar 20 
16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.800346 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wcb96"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.801439 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wcb96" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.807001 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.807278 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.807485 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.807644 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tk8qs" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.808226 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-phstr"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.809249 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.813705 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.826321 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-phstr"] Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.849169 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-conf\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.849231 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-startup\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.849263 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-metrics-certs\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.849283 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f44b2431-e7ee-41a8-98c0-957a933f8cd8-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jggh8\" (UID: \"f44b2431-e7ee-41a8-98c0-957a933f8cd8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 
16:18:00.849310 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqtn\" (UniqueName: \"kubernetes.io/projected/f44b2431-e7ee-41a8-98c0-957a933f8cd8-kube-api-access-vzqtn\") pod \"frr-k8s-webhook-server-bcc4b6f68-jggh8\" (UID: \"f44b2431-e7ee-41a8-98c0-957a933f8cd8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.850205 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-sockets\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.850261 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8z5c\" (UniqueName: \"kubernetes.io/projected/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-kube-api-access-h8z5c\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.850288 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-reloader\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.850324 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-metrics\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951405 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/79348741-9f2d-4c18-be91-9a49e0af8ec8-metallb-excludel2\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951461 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8z5c\" (UniqueName: \"kubernetes.io/projected/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-kube-api-access-h8z5c\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951493 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-reloader\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951521 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-metrics\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951555 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-conf\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951581 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-startup\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951621 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-metrics-certs\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951640 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f44b2431-e7ee-41a8-98c0-957a933f8cd8-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jggh8\" (UID: \"f44b2431-e7ee-41a8-98c0-957a933f8cd8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951661 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-metrics-certs\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951687 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqtn\" (UniqueName: \"kubernetes.io/projected/f44b2431-e7ee-41a8-98c0-957a933f8cd8-kube-api-access-vzqtn\") pod \"frr-k8s-webhook-server-bcc4b6f68-jggh8\" (UID: \"f44b2431-e7ee-41a8-98c0-957a933f8cd8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951709 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbnhm\" (UniqueName: 
\"kubernetes.io/projected/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-kube-api-access-dbnhm\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951733 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951756 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmlw\" (UniqueName: \"kubernetes.io/projected/79348741-9f2d-4c18-be91-9a49e0af8ec8-kube-api-access-hkmlw\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951801 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-cert\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951827 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-metrics-certs\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951848 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-sockets\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.951987 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-reloader\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.952078 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-metrics\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.952183 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-conf\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.952265 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-sockets\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.952840 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-frr-startup\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.961783 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-metrics-certs\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.963560 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f44b2431-e7ee-41a8-98c0-957a933f8cd8-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jggh8\" (UID: \"f44b2431-e7ee-41a8-98c0-957a933f8cd8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.970315 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8z5c\" (UniqueName: \"kubernetes.io/projected/d2ccda47-7e80-4cab-8b14-fffa6e1a73b2-kube-api-access-h8z5c\") pod \"frr-k8s-64lgp\" (UID: \"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2\") " pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:00 crc kubenswrapper[4675]: I0320 16:18:00.975828 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqtn\" (UniqueName: \"kubernetes.io/projected/f44b2431-e7ee-41a8-98c0-957a933f8cd8-kube-api-access-vzqtn\") pod \"frr-k8s-webhook-server-bcc4b6f68-jggh8\" (UID: \"f44b2431-e7ee-41a8-98c0-957a933f8cd8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.028035 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.035965 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.054416 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-metrics-certs\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.055232 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbnhm\" (UniqueName: \"kubernetes.io/projected/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-kube-api-access-dbnhm\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.055271 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.054613 4675 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.055304 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmlw\" (UniqueName: \"kubernetes.io/projected/79348741-9f2d-4c18-be91-9a49e0af8ec8-kube-api-access-hkmlw\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.055339 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-cert\") pod 
\"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.055397 4675 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.055462 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist podName:79348741-9f2d-4c18-be91-9a49e0af8ec8 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:01.55543986 +0000 UTC m=+1001.589069397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist") pod "speaker-wcb96" (UID: "79348741-9f2d-4c18-be91-9a49e0af8ec8") : secret "metallb-memberlist" not found Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.055492 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-metrics-certs\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.055567 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/79348741-9f2d-4c18-be91-9a49e0af8ec8-metallb-excludel2\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.055668 4675 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.055715 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-metrics-certs podName:f5a8475e-06f7-4340-ba60-cc910fb3e2c5 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:01.555698567 +0000 UTC m=+1001.589328104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-metrics-certs") pod "controller-7bb4cc7c98-phstr" (UID: "f5a8475e-06f7-4340-ba60-cc910fb3e2c5") : secret "controller-certs-secret" not found Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.056350 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/79348741-9f2d-4c18-be91-9a49e0af8ec8-metallb-excludel2\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.056416 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-metrics-certs podName:79348741-9f2d-4c18-be91-9a49e0af8ec8 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:01.556404607 +0000 UTC m=+1001.590034254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-metrics-certs") pod "speaker-wcb96" (UID: "79348741-9f2d-4c18-be91-9a49e0af8ec8") : secret "speaker-certs-secret" not found Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.057759 4675 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.071255 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-cert\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.086397 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmlw\" (UniqueName: \"kubernetes.io/projected/79348741-9f2d-4c18-be91-9a49e0af8ec8-kube-api-access-hkmlw\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.088500 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbnhm\" (UniqueName: \"kubernetes.io/projected/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-kube-api-access-dbnhm\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.464920 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"f5af25a674f771d2c85b1065f724b444534d97780f44567807313a1e08d8198c"} Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.466539 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567058-nfwrp" event={"ID":"b738426f-49bb-4a73-b55f-c4840a67a7d5","Type":"ContainerStarted","Data":"2a97ce1d2e38b7f79751acfa5a4760e5d967f40d4a2bb035118375c42c1a8c4f"} Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.499192 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8"] Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.563482 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-metrics-certs\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.563590 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-metrics-certs\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.563621 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.563750 4675 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 16:18:01 crc kubenswrapper[4675]: E0320 16:18:01.563824 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist podName:79348741-9f2d-4c18-be91-9a49e0af8ec8 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:18:02.563808143 +0000 UTC m=+1002.597437680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist") pod "speaker-wcb96" (UID: "79348741-9f2d-4c18-be91-9a49e0af8ec8") : secret "metallb-memberlist" not found Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.572423 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5a8475e-06f7-4340-ba60-cc910fb3e2c5-metrics-certs\") pod \"controller-7bb4cc7c98-phstr\" (UID: \"f5a8475e-06f7-4340-ba60-cc910fb3e2c5\") " pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.574347 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-metrics-certs\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:01 crc kubenswrapper[4675]: I0320 16:18:01.741217 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.198116 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-phstr"] Mar 20 16:18:02 crc kubenswrapper[4675]: W0320 16:18:02.207002 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a8475e_06f7_4340_ba60_cc910fb3e2c5.slice/crio-fd80276839d131b52665098179e18bbb3d114bb18b2618a332126d89e6db616f WatchSource:0}: Error finding container fd80276839d131b52665098179e18bbb3d114bb18b2618a332126d89e6db616f: Status 404 returned error can't find the container with id fd80276839d131b52665098179e18bbb3d114bb18b2618a332126d89e6db616f Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.488097 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-phstr" event={"ID":"f5a8475e-06f7-4340-ba60-cc910fb3e2c5","Type":"ContainerStarted","Data":"540dab328c4a69f0af9145dd3621494e60649f4a399515fbb81e9c89fc258d4c"} Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.488301 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-phstr" event={"ID":"f5a8475e-06f7-4340-ba60-cc910fb3e2c5","Type":"ContainerStarted","Data":"fd80276839d131b52665098179e18bbb3d114bb18b2618a332126d89e6db616f"} Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.489495 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" event={"ID":"f44b2431-e7ee-41a8-98c0-957a933f8cd8","Type":"ContainerStarted","Data":"3eb8bcdbd84190a2d9c5f58bbf734e0a7519eda374796dcdd47d308f15cd5fc5"} Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.577676 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist\") pod 
\"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.583628 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79348741-9f2d-4c18-be91-9a49e0af8ec8-memberlist\") pod \"speaker-wcb96\" (UID: \"79348741-9f2d-4c18-be91-9a49e0af8ec8\") " pod="metallb-system/speaker-wcb96" Mar 20 16:18:02 crc kubenswrapper[4675]: I0320 16:18:02.625222 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wcb96" Mar 20 16:18:02 crc kubenswrapper[4675]: W0320 16:18:02.665358 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79348741_9f2d_4c18_be91_9a49e0af8ec8.slice/crio-6c9314dd6f9b0cde4bb93e7790ade9553c8d9b27d0534bf0b19fd048dac36089 WatchSource:0}: Error finding container 6c9314dd6f9b0cde4bb93e7790ade9553c8d9b27d0534bf0b19fd048dac36089: Status 404 returned error can't find the container with id 6c9314dd6f9b0cde4bb93e7790ade9553c8d9b27d0534bf0b19fd048dac36089 Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.497666 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-phstr" event={"ID":"f5a8475e-06f7-4340-ba60-cc910fb3e2c5","Type":"ContainerStarted","Data":"96ae78bed5725dce6f9c7bcf832b24f31b2208f30b75d70597d3670f5d6d2c8c"} Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.498066 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.501336 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wcb96" event={"ID":"79348741-9f2d-4c18-be91-9a49e0af8ec8","Type":"ContainerStarted","Data":"1655d8f742243feffc51526fd621efefd4c38532440ac1ca2b5ee7d6f5bdba44"} Mar 20 16:18:03 crc 
kubenswrapper[4675]: I0320 16:18:03.501373 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wcb96" event={"ID":"79348741-9f2d-4c18-be91-9a49e0af8ec8","Type":"ContainerStarted","Data":"6fd567e104250161663c245acca7b06f5dd901468306a821466d2295d105084c"} Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.501383 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wcb96" event={"ID":"79348741-9f2d-4c18-be91-9a49e0af8ec8","Type":"ContainerStarted","Data":"6c9314dd6f9b0cde4bb93e7790ade9553c8d9b27d0534bf0b19fd048dac36089"} Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.501534 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wcb96" Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.509200 4675 generic.go:334] "Generic (PLEG): container finished" podID="b738426f-49bb-4a73-b55f-c4840a67a7d5" containerID="a1b063b121dead7b9addc91d2381721786b2495c65eeff023e0eb4174823569b" exitCode=0 Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.509245 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-nfwrp" event={"ID":"b738426f-49bb-4a73-b55f-c4840a67a7d5","Type":"ContainerDied","Data":"a1b063b121dead7b9addc91d2381721786b2495c65eeff023e0eb4174823569b"} Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.512457 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-phstr" podStartSLOduration=3.512439232 podStartE2EDuration="3.512439232s" podCreationTimestamp="2026-03-20 16:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:18:03.512437552 +0000 UTC m=+1003.546067089" watchObservedRunningTime="2026-03-20 16:18:03.512439232 +0000 UTC m=+1003.546068769" Mar 20 16:18:03 crc kubenswrapper[4675]: I0320 16:18:03.544007 4675 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wcb96" podStartSLOduration=3.543989472 podStartE2EDuration="3.543989472s" podCreationTimestamp="2026-03-20 16:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:18:03.541333518 +0000 UTC m=+1003.574963055" watchObservedRunningTime="2026-03-20 16:18:03.543989472 +0000 UTC m=+1003.577619009" Mar 20 16:18:04 crc kubenswrapper[4675]: I0320 16:18:04.425339 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:18:04 crc kubenswrapper[4675]: I0320 16:18:04.425431 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:18:04 crc kubenswrapper[4675]: I0320 16:18:04.425513 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:18:04 crc kubenswrapper[4675]: I0320 16:18:04.426619 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b8ad200c1fd09c2db80eb419a24ccfd9bc395099eb6158f7f572743729d42ad"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:18:04 crc kubenswrapper[4675]: I0320 16:18:04.426790 4675 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://8b8ad200c1fd09c2db80eb419a24ccfd9bc395099eb6158f7f572743729d42ad" gracePeriod=600 Mar 20 16:18:04 crc kubenswrapper[4675]: I0320 16:18:04.919136 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.046977 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpmwv\" (UniqueName: \"kubernetes.io/projected/b738426f-49bb-4a73-b55f-c4840a67a7d5-kube-api-access-tpmwv\") pod \"b738426f-49bb-4a73-b55f-c4840a67a7d5\" (UID: \"b738426f-49bb-4a73-b55f-c4840a67a7d5\") " Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.065836 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b738426f-49bb-4a73-b55f-c4840a67a7d5-kube-api-access-tpmwv" (OuterVolumeSpecName: "kube-api-access-tpmwv") pod "b738426f-49bb-4a73-b55f-c4840a67a7d5" (UID: "b738426f-49bb-4a73-b55f-c4840a67a7d5"). InnerVolumeSpecName "kube-api-access-tpmwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.148255 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpmwv\" (UniqueName: \"kubernetes.io/projected/b738426f-49bb-4a73-b55f-c4840a67a7d5-kube-api-access-tpmwv\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.533594 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="8b8ad200c1fd09c2db80eb419a24ccfd9bc395099eb6158f7f572743729d42ad" exitCode=0 Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.534655 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"8b8ad200c1fd09c2db80eb419a24ccfd9bc395099eb6158f7f572743729d42ad"} Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.534808 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"82c9eee9831702a396f1ab945d5ca7dca1e7cbf4d14cc472240a2d6bc5bec93c"} Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.534906 4675 scope.go:117] "RemoveContainer" containerID="3892cdfbf0fe325e0457d2710b051a0a4b16edb3528b76523a2a7e168e5d772b" Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.539111 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-nfwrp" event={"ID":"b738426f-49bb-4a73-b55f-c4840a67a7d5","Type":"ContainerDied","Data":"2a97ce1d2e38b7f79751acfa5a4760e5d967f40d4a2bb035118375c42c1a8c4f"} Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.539151 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a97ce1d2e38b7f79751acfa5a4760e5d967f40d4a2bb035118375c42c1a8c4f" Mar 20 
16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.539201 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-nfwrp" Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.971983 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-d9ghs"] Mar 20 16:18:05 crc kubenswrapper[4675]: I0320 16:18:05.980176 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-d9ghs"] Mar 20 16:18:06 crc kubenswrapper[4675]: I0320 16:18:06.685054 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8" path="/var/lib/kubelet/pods/3df984a1-b77c-4fa4-a9ec-0b13fe5f5ee8/volumes" Mar 20 16:18:09 crc kubenswrapper[4675]: I0320 16:18:09.597844 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" event={"ID":"f44b2431-e7ee-41a8-98c0-957a933f8cd8","Type":"ContainerStarted","Data":"9bea4d7518b72a034cb856ecd361929584ae3b380ebaac14bb05feff08b1f5cb"} Mar 20 16:18:09 crc kubenswrapper[4675]: I0320 16:18:09.598383 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:09 crc kubenswrapper[4675]: I0320 16:18:09.601191 4675 generic.go:334] "Generic (PLEG): container finished" podID="d2ccda47-7e80-4cab-8b14-fffa6e1a73b2" containerID="484775256b3520cf9a6081296586eb8700ed8a757a2038876f7c61b019dafdd4" exitCode=0 Mar 20 16:18:09 crc kubenswrapper[4675]: I0320 16:18:09.601228 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerDied","Data":"484775256b3520cf9a6081296586eb8700ed8a757a2038876f7c61b019dafdd4"} Mar 20 16:18:09 crc kubenswrapper[4675]: I0320 16:18:09.615616 4675 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" podStartSLOduration=2.107993044 podStartE2EDuration="9.615589001s" podCreationTimestamp="2026-03-20 16:18:00 +0000 UTC" firstStartedPulling="2026-03-20 16:18:01.49949342 +0000 UTC m=+1001.533122967" lastFinishedPulling="2026-03-20 16:18:09.007089387 +0000 UTC m=+1009.040718924" observedRunningTime="2026-03-20 16:18:09.614920872 +0000 UTC m=+1009.648550449" watchObservedRunningTime="2026-03-20 16:18:09.615589001 +0000 UTC m=+1009.649218558" Mar 20 16:18:10 crc kubenswrapper[4675]: I0320 16:18:10.609527 4675 generic.go:334] "Generic (PLEG): container finished" podID="d2ccda47-7e80-4cab-8b14-fffa6e1a73b2" containerID="ad4af1426f416b9a2ab26b0a1d65aec04e564e362c17b0febf4fedcfed06c0f6" exitCode=0 Mar 20 16:18:10 crc kubenswrapper[4675]: I0320 16:18:10.609634 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerDied","Data":"ad4af1426f416b9a2ab26b0a1d65aec04e564e362c17b0febf4fedcfed06c0f6"} Mar 20 16:18:11 crc kubenswrapper[4675]: I0320 16:18:11.616953 4675 generic.go:334] "Generic (PLEG): container finished" podID="d2ccda47-7e80-4cab-8b14-fffa6e1a73b2" containerID="70dbda6d6a06df99b498cd5bd338354c4a1a97454613efc9840b53a934bd7a84" exitCode=0 Mar 20 16:18:11 crc kubenswrapper[4675]: I0320 16:18:11.617043 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerDied","Data":"70dbda6d6a06df99b498cd5bd338354c4a1a97454613efc9840b53a934bd7a84"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.626940 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"a922405544f662628c425a4c73ef332a056f5e9ea3a5cc2386af0897fd676230"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628069 
4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"b1d57dd6a9d8e74dfee48693d5d70e216a91dcdea4160d480de1354956dfef6c"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628090 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628133 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"56b7ffb9429da3d4e6d145f788a741135919a6cf2f1480dd2f2922c78808c4a0"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628147 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"b5ea56474c08e2189b4dc4fa388017aa17133736d5bef443a20503edb56d57e6"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628228 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wcb96" Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628263 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"64ca9c6d315fc51dfe1354f53bdf10332c9a9ed959d28ae199b239c8b33d02d7"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.628276 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64lgp" event={"ID":"d2ccda47-7e80-4cab-8b14-fffa6e1a73b2","Type":"ContainerStarted","Data":"3b4efdf2cb94605efb0d174652c42515c2ce5f9f16c71853ee0c2bb9f8235044"} Mar 20 16:18:12 crc kubenswrapper[4675]: I0320 16:18:12.647566 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-64lgp" 
podStartSLOduration=4.865544128 podStartE2EDuration="12.647545355s" podCreationTimestamp="2026-03-20 16:18:00 +0000 UTC" firstStartedPulling="2026-03-20 16:18:01.209742902 +0000 UTC m=+1001.243372449" lastFinishedPulling="2026-03-20 16:18:08.991744139 +0000 UTC m=+1009.025373676" observedRunningTime="2026-03-20 16:18:12.645671543 +0000 UTC m=+1012.679301090" watchObservedRunningTime="2026-03-20 16:18:12.647545355 +0000 UTC m=+1012.681174892" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.264599 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5d6fm"] Mar 20 16:18:15 crc kubenswrapper[4675]: E0320 16:18:15.265458 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b738426f-49bb-4a73-b55f-c4840a67a7d5" containerName="oc" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.265475 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b738426f-49bb-4a73-b55f-c4840a67a7d5" containerName="oc" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.265589 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b738426f-49bb-4a73-b55f-c4840a67a7d5" containerName="oc" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.266050 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.269021 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hwsbz" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.269937 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.270508 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.286843 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5d6fm"] Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.388227 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tpp\" (UniqueName: \"kubernetes.io/projected/300b257c-d466-4e5c-bd66-ed1325f1274e-kube-api-access-k5tpp\") pod \"openstack-operator-index-5d6fm\" (UID: \"300b257c-d466-4e5c-bd66-ed1325f1274e\") " pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.489699 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tpp\" (UniqueName: \"kubernetes.io/projected/300b257c-d466-4e5c-bd66-ed1325f1274e-kube-api-access-k5tpp\") pod \"openstack-operator-index-5d6fm\" (UID: \"300b257c-d466-4e5c-bd66-ed1325f1274e\") " pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.507455 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tpp\" (UniqueName: \"kubernetes.io/projected/300b257c-d466-4e5c-bd66-ed1325f1274e-kube-api-access-k5tpp\") pod \"openstack-operator-index-5d6fm\" (UID: 
\"300b257c-d466-4e5c-bd66-ed1325f1274e\") " pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.583798 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:15 crc kubenswrapper[4675]: I0320 16:18:15.976281 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5d6fm"] Mar 20 16:18:15 crc kubenswrapper[4675]: W0320 16:18:15.980987 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod300b257c_d466_4e5c_bd66_ed1325f1274e.slice/crio-7e3f26f7406b18cf644e99e00c11bf7aad9971448d1e28271cc53591840e0059 WatchSource:0}: Error finding container 7e3f26f7406b18cf644e99e00c11bf7aad9971448d1e28271cc53591840e0059: Status 404 returned error can't find the container with id 7e3f26f7406b18cf644e99e00c11bf7aad9971448d1e28271cc53591840e0059 Mar 20 16:18:16 crc kubenswrapper[4675]: I0320 16:18:16.031092 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:16 crc kubenswrapper[4675]: I0320 16:18:16.068181 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:16 crc kubenswrapper[4675]: I0320 16:18:16.651207 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5d6fm" event={"ID":"300b257c-d466-4e5c-bd66-ed1325f1274e","Type":"ContainerStarted","Data":"7e3f26f7406b18cf644e99e00c11bf7aad9971448d1e28271cc53591840e0059"} Mar 20 16:18:18 crc kubenswrapper[4675]: I0320 16:18:18.634604 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5d6fm"] Mar 20 16:18:18 crc kubenswrapper[4675]: I0320 16:18:18.666621 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-5d6fm" event={"ID":"300b257c-d466-4e5c-bd66-ed1325f1274e","Type":"ContainerStarted","Data":"035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba"} Mar 20 16:18:18 crc kubenswrapper[4675]: I0320 16:18:18.688549 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5d6fm" podStartSLOduration=1.472325732 podStartE2EDuration="3.688528731s" podCreationTimestamp="2026-03-20 16:18:15 +0000 UTC" firstStartedPulling="2026-03-20 16:18:15.983706309 +0000 UTC m=+1016.017335846" lastFinishedPulling="2026-03-20 16:18:18.199909308 +0000 UTC m=+1018.233538845" observedRunningTime="2026-03-20 16:18:18.687717469 +0000 UTC m=+1018.721347086" watchObservedRunningTime="2026-03-20 16:18:18.688528731 +0000 UTC m=+1018.722158278" Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.246405 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7lv49"] Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.248102 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.259324 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7lv49"] Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.342734 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shf52\" (UniqueName: \"kubernetes.io/projected/b65b6761-2183-4ab2-9c85-835a172cd2ee-kube-api-access-shf52\") pod \"openstack-operator-index-7lv49\" (UID: \"b65b6761-2183-4ab2-9c85-835a172cd2ee\") " pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.444119 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shf52\" (UniqueName: \"kubernetes.io/projected/b65b6761-2183-4ab2-9c85-835a172cd2ee-kube-api-access-shf52\") pod \"openstack-operator-index-7lv49\" (UID: \"b65b6761-2183-4ab2-9c85-835a172cd2ee\") " pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.478682 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shf52\" (UniqueName: \"kubernetes.io/projected/b65b6761-2183-4ab2-9c85-835a172cd2ee-kube-api-access-shf52\") pod \"openstack-operator-index-7lv49\" (UID: \"b65b6761-2183-4ab2-9c85-835a172cd2ee\") " pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.574515 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.672736 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5d6fm" podUID="300b257c-d466-4e5c-bd66-ed1325f1274e" containerName="registry-server" containerID="cri-o://035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba" gracePeriod=2 Mar 20 16:18:19 crc kubenswrapper[4675]: I0320 16:18:19.986361 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7lv49"] Mar 20 16:18:19 crc kubenswrapper[4675]: W0320 16:18:19.989915 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb65b6761_2183_4ab2_9c85_835a172cd2ee.slice/crio-530a182662072e7491af756d8bade28bfd9b10c990f783dd97faab42dad7a759 WatchSource:0}: Error finding container 530a182662072e7491af756d8bade28bfd9b10c990f783dd97faab42dad7a759: Status 404 returned error can't find the container with id 530a182662072e7491af756d8bade28bfd9b10c990f783dd97faab42dad7a759 Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.007959 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.051200 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5tpp\" (UniqueName: \"kubernetes.io/projected/300b257c-d466-4e5c-bd66-ed1325f1274e-kube-api-access-k5tpp\") pod \"300b257c-d466-4e5c-bd66-ed1325f1274e\" (UID: \"300b257c-d466-4e5c-bd66-ed1325f1274e\") " Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.056106 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300b257c-d466-4e5c-bd66-ed1325f1274e-kube-api-access-k5tpp" (OuterVolumeSpecName: "kube-api-access-k5tpp") pod "300b257c-d466-4e5c-bd66-ed1325f1274e" (UID: "300b257c-d466-4e5c-bd66-ed1325f1274e"). InnerVolumeSpecName "kube-api-access-k5tpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.154545 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5tpp\" (UniqueName: \"kubernetes.io/projected/300b257c-d466-4e5c-bd66-ed1325f1274e-kube-api-access-k5tpp\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.683720 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lv49" event={"ID":"b65b6761-2183-4ab2-9c85-835a172cd2ee","Type":"ContainerStarted","Data":"9414cc9f73bd9fbed8750b3341ef2aa96608abbfb1b80c3dc0abe9306b8976f3"} Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.683807 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lv49" event={"ID":"b65b6761-2183-4ab2-9c85-835a172cd2ee","Type":"ContainerStarted","Data":"530a182662072e7491af756d8bade28bfd9b10c990f783dd97faab42dad7a759"} Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.686464 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="300b257c-d466-4e5c-bd66-ed1325f1274e" containerID="035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba" exitCode=0 Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.686535 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5d6fm" event={"ID":"300b257c-d466-4e5c-bd66-ed1325f1274e","Type":"ContainerDied","Data":"035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba"} Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.686574 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5d6fm" event={"ID":"300b257c-d466-4e5c-bd66-ed1325f1274e","Type":"ContainerDied","Data":"7e3f26f7406b18cf644e99e00c11bf7aad9971448d1e28271cc53591840e0059"} Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.686614 4675 scope.go:117] "RemoveContainer" containerID="035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.686752 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5d6fm" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.711279 4675 scope.go:117] "RemoveContainer" containerID="035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba" Mar 20 16:18:20 crc kubenswrapper[4675]: E0320 16:18:20.712040 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba\": container with ID starting with 035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba not found: ID does not exist" containerID="035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.712087 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba"} err="failed to get container status \"035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba\": rpc error: code = NotFound desc = could not find container \"035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba\": container with ID starting with 035b2844696d68f17d1dc0e76353a7265124f55b0dbeaac4b337c5aba69c34ba not found: ID does not exist" Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.718720 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7lv49" podStartSLOduration=1.661957091 podStartE2EDuration="1.718697013s" podCreationTimestamp="2026-03-20 16:18:19 +0000 UTC" firstStartedPulling="2026-03-20 16:18:19.99474746 +0000 UTC m=+1020.028376997" lastFinishedPulling="2026-03-20 16:18:20.051487382 +0000 UTC m=+1020.085116919" observedRunningTime="2026-03-20 16:18:20.713241321 +0000 UTC m=+1020.746870858" watchObservedRunningTime="2026-03-20 16:18:20.718697013 +0000 UTC m=+1020.752326560" Mar 20 16:18:20 crc kubenswrapper[4675]: 
I0320 16:18:20.731200 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5d6fm"] Mar 20 16:18:20 crc kubenswrapper[4675]: I0320 16:18:20.739449 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5d6fm"] Mar 20 16:18:21 crc kubenswrapper[4675]: I0320 16:18:21.031500 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-64lgp" Mar 20 16:18:21 crc kubenswrapper[4675]: I0320 16:18:21.040432 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jggh8" Mar 20 16:18:21 crc kubenswrapper[4675]: I0320 16:18:21.745197 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-phstr" Mar 20 16:18:22 crc kubenswrapper[4675]: I0320 16:18:22.682517 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300b257c-d466-4e5c-bd66-ed1325f1274e" path="/var/lib/kubelet/pods/300b257c-d466-4e5c-bd66-ed1325f1274e/volumes" Mar 20 16:18:29 crc kubenswrapper[4675]: I0320 16:18:29.574646 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:29 crc kubenswrapper[4675]: I0320 16:18:29.575164 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:29 crc kubenswrapper[4675]: I0320 16:18:29.615320 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:29 crc kubenswrapper[4675]: I0320 16:18:29.786990 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7lv49" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.487283 4675 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4"] Mar 20 16:18:30 crc kubenswrapper[4675]: E0320 16:18:30.487684 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300b257c-d466-4e5c-bd66-ed1325f1274e" containerName="registry-server" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.487711 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="300b257c-d466-4e5c-bd66-ed1325f1274e" containerName="registry-server" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.487942 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="300b257c-d466-4e5c-bd66-ed1325f1274e" containerName="registry-server" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.489487 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.492127 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6spbq" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.493870 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4"] Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.594599 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v25fs\" (UniqueName: \"kubernetes.io/projected/3f783bd0-ccf7-4dab-9412-6f2b1a943424-kube-api-access-v25fs\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.594660 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-bundle\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.594756 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-util\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.695522 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-util\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.695640 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v25fs\" (UniqueName: \"kubernetes.io/projected/3f783bd0-ccf7-4dab-9412-6f2b1a943424-kube-api-access-v25fs\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.695674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-bundle\") pod 
\"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.696194 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-bundle\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.696581 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-util\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.719317 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v25fs\" (UniqueName: \"kubernetes.io/projected/3f783bd0-ccf7-4dab-9412-6f2b1a943424-kube-api-access-v25fs\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:30 crc kubenswrapper[4675]: I0320 16:18:30.815530 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:31 crc kubenswrapper[4675]: I0320 16:18:31.281103 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4"] Mar 20 16:18:31 crc kubenswrapper[4675]: W0320 16:18:31.291097 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f783bd0_ccf7_4dab_9412_6f2b1a943424.slice/crio-e9756dcc8cc87adae8aee3d9c8131a5cd6c56a480e17a3006063b65bcc619915 WatchSource:0}: Error finding container e9756dcc8cc87adae8aee3d9c8131a5cd6c56a480e17a3006063b65bcc619915: Status 404 returned error can't find the container with id e9756dcc8cc87adae8aee3d9c8131a5cd6c56a480e17a3006063b65bcc619915 Mar 20 16:18:31 crc kubenswrapper[4675]: I0320 16:18:31.767632 4675 generic.go:334] "Generic (PLEG): container finished" podID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerID="52864d6ba4820a2958d95c810c64f4f24a01b7fb4cc2c099032d89a6ede29079" exitCode=0 Mar 20 16:18:31 crc kubenswrapper[4675]: I0320 16:18:31.767673 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" event={"ID":"3f783bd0-ccf7-4dab-9412-6f2b1a943424","Type":"ContainerDied","Data":"52864d6ba4820a2958d95c810c64f4f24a01b7fb4cc2c099032d89a6ede29079"} Mar 20 16:18:31 crc kubenswrapper[4675]: I0320 16:18:31.767705 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" event={"ID":"3f783bd0-ccf7-4dab-9412-6f2b1a943424","Type":"ContainerStarted","Data":"e9756dcc8cc87adae8aee3d9c8131a5cd6c56a480e17a3006063b65bcc619915"} Mar 20 16:18:34 crc kubenswrapper[4675]: I0320 16:18:34.791592 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerID="68849007d7bddd9b0c7fbbf28211964dfbd5138f66e781baccbcaf1fa8eb1519" exitCode=0 Mar 20 16:18:34 crc kubenswrapper[4675]: I0320 16:18:34.791747 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" event={"ID":"3f783bd0-ccf7-4dab-9412-6f2b1a943424","Type":"ContainerDied","Data":"68849007d7bddd9b0c7fbbf28211964dfbd5138f66e781baccbcaf1fa8eb1519"} Mar 20 16:18:35 crc kubenswrapper[4675]: I0320 16:18:35.802955 4675 generic.go:334] "Generic (PLEG): container finished" podID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerID="5809906cfb5683775f986e57be43a4dc11107393b0b87df9816491666ce72211" exitCode=0 Mar 20 16:18:35 crc kubenswrapper[4675]: I0320 16:18:35.803008 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" event={"ID":"3f783bd0-ccf7-4dab-9412-6f2b1a943424","Type":"ContainerDied","Data":"5809906cfb5683775f986e57be43a4dc11107393b0b87df9816491666ce72211"} Mar 20 16:18:36 crc kubenswrapper[4675]: I0320 16:18:36.583857 4675 scope.go:117] "RemoveContainer" containerID="e5fbb67eea72da425118a53beb69fd39d85cd3cf4d1d9dbfb46c92cb55d56770" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.090225 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.179180 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v25fs\" (UniqueName: \"kubernetes.io/projected/3f783bd0-ccf7-4dab-9412-6f2b1a943424-kube-api-access-v25fs\") pod \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.179232 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-bundle\") pod \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.179272 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-util\") pod \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\" (UID: \"3f783bd0-ccf7-4dab-9412-6f2b1a943424\") " Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.180250 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-bundle" (OuterVolumeSpecName: "bundle") pod "3f783bd0-ccf7-4dab-9412-6f2b1a943424" (UID: "3f783bd0-ccf7-4dab-9412-6f2b1a943424"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.189980 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-util" (OuterVolumeSpecName: "util") pod "3f783bd0-ccf7-4dab-9412-6f2b1a943424" (UID: "3f783bd0-ccf7-4dab-9412-6f2b1a943424"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.192063 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f783bd0-ccf7-4dab-9412-6f2b1a943424-kube-api-access-v25fs" (OuterVolumeSpecName: "kube-api-access-v25fs") pod "3f783bd0-ccf7-4dab-9412-6f2b1a943424" (UID: "3f783bd0-ccf7-4dab-9412-6f2b1a943424"). InnerVolumeSpecName "kube-api-access-v25fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.281202 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v25fs\" (UniqueName: \"kubernetes.io/projected/3f783bd0-ccf7-4dab-9412-6f2b1a943424-kube-api-access-v25fs\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.281255 4675 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.281264 4675 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f783bd0-ccf7-4dab-9412-6f2b1a943424-util\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.824547 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" event={"ID":"3f783bd0-ccf7-4dab-9412-6f2b1a943424","Type":"ContainerDied","Data":"e9756dcc8cc87adae8aee3d9c8131a5cd6c56a480e17a3006063b65bcc619915"} Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.825112 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9756dcc8cc87adae8aee3d9c8131a5cd6c56a480e17a3006063b65bcc619915" Mar 20 16:18:37 crc kubenswrapper[4675]: I0320 16:18:37.824628 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.006226 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2"] Mar 20 16:18:43 crc kubenswrapper[4675]: E0320 16:18:43.006978 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="util" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.006992 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="util" Mar 20 16:18:43 crc kubenswrapper[4675]: E0320 16:18:43.007008 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="pull" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.007013 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="pull" Mar 20 16:18:43 crc kubenswrapper[4675]: E0320 16:18:43.007030 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="extract" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.007036 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="extract" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.007157 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f783bd0-ccf7-4dab-9412-6f2b1a943424" containerName="extract" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.007555 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.012286 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-v4hqt" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.032834 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2"] Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.061218 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58lz\" (UniqueName: \"kubernetes.io/projected/7dfe43c8-9d92-424d-8f75-f4afffd29901-kube-api-access-r58lz\") pod \"openstack-operator-controller-init-94465cd74-nrsq2\" (UID: \"7dfe43c8-9d92-424d-8f75-f4afffd29901\") " pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.162342 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58lz\" (UniqueName: \"kubernetes.io/projected/7dfe43c8-9d92-424d-8f75-f4afffd29901-kube-api-access-r58lz\") pod \"openstack-operator-controller-init-94465cd74-nrsq2\" (UID: \"7dfe43c8-9d92-424d-8f75-f4afffd29901\") " pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.197110 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58lz\" (UniqueName: \"kubernetes.io/projected/7dfe43c8-9d92-424d-8f75-f4afffd29901-kube-api-access-r58lz\") pod \"openstack-operator-controller-init-94465cd74-nrsq2\" (UID: \"7dfe43c8-9d92-424d-8f75-f4afffd29901\") " pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.327741 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.607412 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2"] Mar 20 16:18:43 crc kubenswrapper[4675]: I0320 16:18:43.880546 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" event={"ID":"7dfe43c8-9d92-424d-8f75-f4afffd29901","Type":"ContainerStarted","Data":"f6483fa2bde91ed50359c39f23d4c05f219f1c46c1dc9d7cfead1f4f59a0b685"} Mar 20 16:18:47 crc kubenswrapper[4675]: I0320 16:18:47.925977 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" event={"ID":"7dfe43c8-9d92-424d-8f75-f4afffd29901","Type":"ContainerStarted","Data":"273aaa49b056f1ef12aa26ffd755dcdfc65d662740c960cb9dbc11c810c38bc4"} Mar 20 16:18:47 crc kubenswrapper[4675]: I0320 16:18:47.926598 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:18:47 crc kubenswrapper[4675]: I0320 16:18:47.965692 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" podStartSLOduration=2.276669621 podStartE2EDuration="5.965666724s" podCreationTimestamp="2026-03-20 16:18:42 +0000 UTC" firstStartedPulling="2026-03-20 16:18:43.617718366 +0000 UTC m=+1043.651347903" lastFinishedPulling="2026-03-20 16:18:47.306715459 +0000 UTC m=+1047.340345006" observedRunningTime="2026-03-20 16:18:47.959513662 +0000 UTC m=+1047.993143239" watchObservedRunningTime="2026-03-20 16:18:47.965666724 +0000 UTC m=+1047.999296301" Mar 20 16:18:53 crc kubenswrapper[4675]: I0320 16:18:53.331893 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-94465cd74-nrsq2" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.391818 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.405261 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.408295 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-m2nln" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.434965 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.436574 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.441202 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jsb2p" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.441971 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.449989 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.458065 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.458970 4675 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.463860 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.464802 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-666bv" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.475471 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.476260 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.479760 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-49r8s" Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.481596 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.508030 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"] Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.508858 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.515805 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.515949 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7j2bb"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.528624 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.534550 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.542188 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-44vjl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.550716 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.556514 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qtv\" (UniqueName: \"kubernetes.io/projected/19624364-2dec-43f0-961c-12c5071289fd-kube-api-access-q7qtv\") pod \"barbican-operator-controller-manager-59bc569d95-jrphq\" (UID: \"19624364-2dec-43f0-961c-12c5071289fd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.556581 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7c2q\" (UniqueName: \"kubernetes.io/projected/56bf5935-2a04-4182-8db7-8b98736a96fa-kube-api-access-s7c2q\") pod \"cinder-operator-controller-manager-8d58dc466-c6j4l\" (UID: \"56bf5935-2a04-4182-8db7-8b98736a96fa\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.558082 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.558868 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.563518 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.563857 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t4md6"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.576826 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.577621 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.581363 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mfhxh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.593957 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.596181 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.597470 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.597749 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4lrk2"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.598232 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.603123 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vkn67"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.604864 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.630127 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.651061 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661063 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgkg4\" (UniqueName: \"kubernetes.io/projected/597a22d0-7193-41e5-8312-8cc9aa9a29a8-kube-api-access-mgkg4\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"
Mar 20
16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661119 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d-kube-api-access-4b8tx\") pod \"horizon-operator-controller-manager-8464cc45fb-vrwzl\" (UID: \"fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661144 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661172 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrj5\" (UniqueName: \"kubernetes.io/projected/5ab24f7d-1842-40a4-8ab1-a0299644ecd5-kube-api-access-thrj5\") pod \"glance-operator-controller-manager-79df6bcc97-k6lwh\" (UID: \"5ab24f7d-1842-40a4-8ab1-a0299644ecd5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661237 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khv4\" (UniqueName: \"kubernetes.io/projected/1c0ae0a7-9969-47b3-871f-536af2bd1784-kube-api-access-4khv4\") pod \"keystone-operator-controller-manager-768b96df4c-fpwf9\" (UID: \"1c0ae0a7-9969-47b3-871f-536af2bd1784\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661380 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7qtv\" (UniqueName: \"kubernetes.io/projected/19624364-2dec-43f0-961c-12c5071289fd-kube-api-access-q7qtv\") pod \"barbican-operator-controller-manager-59bc569d95-jrphq\" (UID: \"19624364-2dec-43f0-961c-12c5071289fd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661536 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g2l2\" (UniqueName: \"kubernetes.io/projected/5ce8a4db-9cfd-45da-82be-597b6f3b1257-kube-api-access-9g2l2\") pod \"ironic-operator-controller-manager-6f787dddc9-gvrbv\" (UID: \"5ce8a4db-9cfd-45da-82be-597b6f3b1257\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661561 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhqpc\" (UniqueName: \"kubernetes.io/projected/ec2c3dde-b80d-4baa-a092-c38d978c7c4e-kube-api-access-mhqpc\") pod \"designate-operator-controller-manager-588d4d986b-ktbv8\" (UID: \"ec2c3dde-b80d-4baa-a092-c38d978c7c4e\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661593 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7c2q\" (UniqueName: \"kubernetes.io/projected/56bf5935-2a04-4182-8db7-8b98736a96fa-kube-api-access-s7c2q\") pod \"cinder-operator-controller-manager-8d58dc466-c6j4l\" (UID: \"56bf5935-2a04-4182-8db7-8b98736a96fa\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661621 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt77t\" (UniqueName: \"kubernetes.io/projected/26f79a1c-7e90-4c87-8027-4044ec669321-kube-api-access-bt77t\") pod \"heat-operator-controller-manager-67dd5f86f5-m6dhl\" (UID: \"26f79a1c-7e90-4c87-8027-4044ec669321\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.661725 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6rhn\" (UniqueName: \"kubernetes.io/projected/34bb9539-fd7b-49da-99dd-548e9e8de389-kube-api-access-w6rhn\") pod \"manila-operator-controller-manager-55f864c847-tg6fk\" (UID: \"34bb9539-fd7b-49da-99dd-548e9e8de389\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.666577 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.667595 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.669657 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-26zcp"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.684923 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7c2q\" (UniqueName: \"kubernetes.io/projected/56bf5935-2a04-4182-8db7-8b98736a96fa-kube-api-access-s7c2q\") pod \"cinder-operator-controller-manager-8d58dc466-c6j4l\" (UID: \"56bf5935-2a04-4182-8db7-8b98736a96fa\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.691632 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7qtv\" (UniqueName: \"kubernetes.io/projected/19624364-2dec-43f0-961c-12c5071289fd-kube-api-access-q7qtv\") pod \"barbican-operator-controller-manager-59bc569d95-jrphq\" (UID: \"19624364-2dec-43f0-961c-12c5071289fd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.698213 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.701868 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.719745 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-msv57"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.720507 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.724960 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wpbdh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.732892 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.733605 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-msv57"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.739811 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.743108 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.761836 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.762112 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kgmrh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.762888 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d-kube-api-access-4b8tx\") pod \"horizon-operator-controller-manager-8464cc45fb-vrwzl\" (UID: \"fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.762914 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.762932 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrj5\" (UniqueName: \"kubernetes.io/projected/5ab24f7d-1842-40a4-8ab1-a0299644ecd5-kube-api-access-thrj5\") pod \"glance-operator-controller-manager-79df6bcc97-k6lwh\" (UID: \"5ab24f7d-1842-40a4-8ab1-a0299644ecd5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.762952 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-wrdgz\" (UniqueName: \"kubernetes.io/projected/c4f8d16f-3951-48ab-8525-79be59c6d957-kube-api-access-wrdgz\") pod \"neutron-operator-controller-manager-767865f676-msv57\" (UID: \"c4f8d16f-3951-48ab-8525-79be59c6d957\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.762973 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khv4\" (UniqueName: \"kubernetes.io/projected/1c0ae0a7-9969-47b3-871f-536af2bd1784-kube-api-access-4khv4\") pod \"keystone-operator-controller-manager-768b96df4c-fpwf9\" (UID: \"1c0ae0a7-9969-47b3-871f-536af2bd1784\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763006 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763014 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g2l2\" (UniqueName: \"kubernetes.io/projected/5ce8a4db-9cfd-45da-82be-597b6f3b1257-kube-api-access-9g2l2\") pod \"ironic-operator-controller-manager-6f787dddc9-gvrbv\" (UID: \"5ce8a4db-9cfd-45da-82be-597b6f3b1257\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763041 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhqpc\" (UniqueName: \"kubernetes.io/projected/ec2c3dde-b80d-4baa-a092-c38d978c7c4e-kube-api-access-mhqpc\") pod \"designate-operator-controller-manager-588d4d986b-ktbv8\" (UID: \"ec2c3dde-b80d-4baa-a092-c38d978c7c4e\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763120 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt77t\" (UniqueName: \"kubernetes.io/projected/26f79a1c-7e90-4c87-8027-4044ec669321-kube-api-access-bt77t\") pod \"heat-operator-controller-manager-67dd5f86f5-m6dhl\" (UID: \"26f79a1c-7e90-4c87-8027-4044ec669321\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763158 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cst9\" (UniqueName: \"kubernetes.io/projected/5718414d-60fc-4af8-a8aa-46a12b8114ab-kube-api-access-2cst9\") pod \"mariadb-operator-controller-manager-67ccfc9778-q4k6l\" (UID: \"5718414d-60fc-4af8-a8aa-46a12b8114ab\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763193 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbspp\" (UniqueName: \"kubernetes.io/projected/14e35dfe-49a2-4d89-9404-3ef0311be41e-kube-api-access-bbspp\") pod \"nova-operator-controller-manager-5d488d59fb-fvx4c\" (UID: \"14e35dfe-49a2-4d89-9404-3ef0311be41e\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763378 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6rhn\" (UniqueName: \"kubernetes.io/projected/34bb9539-fd7b-49da-99dd-548e9e8de389-kube-api-access-w6rhn\") pod \"manila-operator-controller-manager-55f864c847-tg6fk\" (UID: \"34bb9539-fd7b-49da-99dd-548e9e8de389\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.763544 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgkg4\" (UniqueName: \"kubernetes.io/projected/597a22d0-7193-41e5-8312-8cc9aa9a29a8-kube-api-access-mgkg4\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"
Mar 20 16:19:11 crc kubenswrapper[4675]: E0320 16:19:11.764908 4675 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 16:19:11 crc kubenswrapper[4675]: E0320 16:19:11.764980 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert podName:597a22d0-7193-41e5-8312-8cc9aa9a29a8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:12.264959267 +0000 UTC m=+1072.298588904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert") pod "infra-operator-controller-manager-8d4c8954d-flsfk" (UID: "597a22d0-7193-41e5-8312-8cc9aa9a29a8") : secret "infra-operator-webhook-server-cert" not found
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.766997 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-887wp"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.771403 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.779297 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.788927 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhqpc\" (UniqueName: \"kubernetes.io/projected/ec2c3dde-b80d-4baa-a092-c38d978c7c4e-kube-api-access-mhqpc\") pod \"designate-operator-controller-manager-588d4d986b-ktbv8\" (UID: \"ec2c3dde-b80d-4baa-a092-c38d978c7c4e\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.792785 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrj5\" (UniqueName: \"kubernetes.io/projected/5ab24f7d-1842-40a4-8ab1-a0299644ecd5-kube-api-access-thrj5\") pod \"glance-operator-controller-manager-79df6bcc97-k6lwh\" (UID: \"5ab24f7d-1842-40a4-8ab1-a0299644ecd5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.793844 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.795817 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6rhn\" (UniqueName: \"kubernetes.io/projected/34bb9539-fd7b-49da-99dd-548e9e8de389-kube-api-access-w6rhn\") pod \"manila-operator-controller-manager-55f864c847-tg6fk\" (UID: \"34bb9539-fd7b-49da-99dd-548e9e8de389\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.799452 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt77t\" (UniqueName: \"kubernetes.io/projected/26f79a1c-7e90-4c87-8027-4044ec669321-kube-api-access-bt77t\") pod \"heat-operator-controller-manager-67dd5f86f5-m6dhl\" (UID: \"26f79a1c-7e90-4c87-8027-4044ec669321\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.802298 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d-kube-api-access-4b8tx\") pod \"horizon-operator-controller-manager-8464cc45fb-vrwzl\" (UID: \"fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.804154 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgkg4\" (UniqueName: \"kubernetes.io/projected/597a22d0-7193-41e5-8312-8cc9aa9a29a8-kube-api-access-mgkg4\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.805612 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khv4\" (UniqueName: \"kubernetes.io/projected/1c0ae0a7-9969-47b3-871f-536af2bd1784-kube-api-access-4khv4\") pod \"keystone-operator-controller-manager-768b96df4c-fpwf9\" (UID: \"1c0ae0a7-9969-47b3-871f-536af2bd1784\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.812883 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.825497 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.825837 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.827616 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.829623 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-t5cmw"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.830850 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g2l2\" (UniqueName: \"kubernetes.io/projected/5ce8a4db-9cfd-45da-82be-597b6f3b1257-kube-api-access-9g2l2\") pod \"ironic-operator-controller-manager-6f787dddc9-gvrbv\" (UID: \"5ce8a4db-9cfd-45da-82be-597b6f3b1257\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.832215 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.835000 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.842212 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.851704 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.852412 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cf9gq"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.852542 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.873893 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdgz\" (UniqueName: \"kubernetes.io/projected/c4f8d16f-3951-48ab-8525-79be59c6d957-kube-api-access-wrdgz\") pod \"neutron-operator-controller-manager-767865f676-msv57\" (UID: \"c4f8d16f-3951-48ab-8525-79be59c6d957\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.874011 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cst9\" (UniqueName: \"kubernetes.io/projected/5718414d-60fc-4af8-a8aa-46a12b8114ab-kube-api-access-2cst9\") pod \"mariadb-operator-controller-manager-67ccfc9778-q4k6l\" (UID: \"5718414d-60fc-4af8-a8aa-46a12b8114ab\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.874066 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbspp\" (UniqueName: \"kubernetes.io/projected/14e35dfe-49a2-4d89-9404-3ef0311be41e-kube-api-access-bbspp\") pod \"nova-operator-controller-manager-5d488d59fb-fvx4c\" (UID: \"14e35dfe-49a2-4d89-9404-3ef0311be41e\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.874561 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.902989 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdgz\" (UniqueName: \"kubernetes.io/projected/c4f8d16f-3951-48ab-8525-79be59c6d957-kube-api-access-wrdgz\") pod \"neutron-operator-controller-manager-767865f676-msv57\" (UID: \"c4f8d16f-3951-48ab-8525-79be59c6d957\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.909353 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cst9\" (UniqueName: \"kubernetes.io/projected/5718414d-60fc-4af8-a8aa-46a12b8114ab-kube-api-access-2cst9\") pod \"mariadb-operator-controller-manager-67ccfc9778-q4k6l\" (UID: \"5718414d-60fc-4af8-a8aa-46a12b8114ab\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.909885 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.910716 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbspp\" (UniqueName: \"kubernetes.io/projected/14e35dfe-49a2-4d89-9404-3ef0311be41e-kube-api-access-bbspp\") pod \"nova-operator-controller-manager-5d488d59fb-fvx4c\" (UID: \"14e35dfe-49a2-4d89-9404-3ef0311be41e\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.911328 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.911487 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.913641 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dds6x"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.919136 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.920030 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.921554 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-m5khd"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.928114 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.934659 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.945280 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.953121 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9"]
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.954593 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.983113 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.983149 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4b5k\" (UniqueName: \"kubernetes.io/projected/427a4fcc-2562-4dc5-8735-0f6a448533ab-kube-api-access-r4b5k\") pod \"octavia-operator-controller-manager-5b9f45d989-bk29d\" (UID: \"427a4fcc-2562-4dc5-8735-0f6a448533ab\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.983170 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m568q\" (UniqueName: \"kubernetes.io/projected/4a01ec4a-d453-44a0-ae13-ef8607f3ccb3-kube-api-access-m568q\") pod \"ovn-operator-controller-manager-884679f54-hrp5b\" (UID: \"4a01ec4a-d453-44a0-ae13-ef8607f3ccb3\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.983188 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5l5n\" (UniqueName: \"kubernetes.io/projected/45153c87-e6ea-4463-9a37-4bc2805530f8-kube-api-access-k5l5n\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.983219 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pznqc\" (UniqueName: \"kubernetes.io/projected/442371dc-c0af-48c8-83fa-01012b636590-kube-api-access-pznqc\") pod \"placement-operator-controller-manager-5784578c99-mtkg9\" (UID: \"442371dc-c0af-48c8-83fa-01012b636590\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.983261 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdtl\" (UniqueName: \"kubernetes.io/projected/1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7-kube-api-access-wpdtl\") pod \"swift-operator-controller-manager-c674c5965-jh8ng\" (UID: \"1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng"
Mar 20 16:19:11 crc kubenswrapper[4675]: I0320 16:19:11.992277 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.006900 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt"]
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.008539 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt"
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.017294 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2rcqx"
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.028589 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt"]
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.054714 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57"
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.060843 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw"]
Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.061672 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.064919 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-tzxgk" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.065343 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw"] Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.084924 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdtl\" (UniqueName: \"kubernetes.io/projected/1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7-kube-api-access-wpdtl\") pod \"swift-operator-controller-manager-c674c5965-jh8ng\" (UID: \"1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.084977 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kk57\" (UniqueName: \"kubernetes.io/projected/9e05e473-0f8c-41a4-8b84-2c0c0d867f90-kube-api-access-8kk57\") pod \"telemetry-operator-controller-manager-d6b694c5-zngbt\" (UID: \"9e05e473-0f8c-41a4-8b84-2c0c0d867f90\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.085046 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drnbj\" (UniqueName: \"kubernetes.io/projected/9e316228-bcfc-4362-9786-67097a7b0730-kube-api-access-drnbj\") pod \"test-operator-controller-manager-5c5cb9c4d7-tphgw\" (UID: \"9e316228-bcfc-4362-9786-67097a7b0730\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.085117 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.085151 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4b5k\" (UniqueName: \"kubernetes.io/projected/427a4fcc-2562-4dc5-8735-0f6a448533ab-kube-api-access-r4b5k\") pod \"octavia-operator-controller-manager-5b9f45d989-bk29d\" (UID: \"427a4fcc-2562-4dc5-8735-0f6a448533ab\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.085216 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m568q\" (UniqueName: \"kubernetes.io/projected/4a01ec4a-d453-44a0-ae13-ef8607f3ccb3-kube-api-access-m568q\") pod \"ovn-operator-controller-manager-884679f54-hrp5b\" (UID: \"4a01ec4a-d453-44a0-ae13-ef8607f3ccb3\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.085261 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5l5n\" (UniqueName: \"kubernetes.io/projected/45153c87-e6ea-4463-9a37-4bc2805530f8-kube-api-access-k5l5n\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.085357 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pznqc\" (UniqueName: 
\"kubernetes.io/projected/442371dc-c0af-48c8-83fa-01012b636590-kube-api-access-pznqc\") pod \"placement-operator-controller-manager-5784578c99-mtkg9\" (UID: \"442371dc-c0af-48c8-83fa-01012b636590\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.085639 4675 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.085756 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert podName:45153c87-e6ea-4463-9a37-4bc2805530f8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:12.585715416 +0000 UTC m=+1072.619345033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" (UID: "45153c87-e6ea-4463-9a37-4bc2805530f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.123650 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m568q\" (UniqueName: \"kubernetes.io/projected/4a01ec4a-d453-44a0-ae13-ef8607f3ccb3-kube-api-access-m568q\") pod \"ovn-operator-controller-manager-884679f54-hrp5b\" (UID: \"4a01ec4a-d453-44a0-ae13-ef8607f3ccb3\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.124482 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2"] Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.125219 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wpdtl\" (UniqueName: \"kubernetes.io/projected/1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7-kube-api-access-wpdtl\") pod \"swift-operator-controller-manager-c674c5965-jh8ng\" (UID: \"1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.125983 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.129836 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-chmkw" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.136475 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4b5k\" (UniqueName: \"kubernetes.io/projected/427a4fcc-2562-4dc5-8735-0f6a448533ab-kube-api-access-r4b5k\") pod \"octavia-operator-controller-manager-5b9f45d989-bk29d\" (UID: \"427a4fcc-2562-4dc5-8735-0f6a448533ab\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.136554 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2"] Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.138784 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5l5n\" (UniqueName: \"kubernetes.io/projected/45153c87-e6ea-4463-9a37-4bc2805530f8-kube-api-access-k5l5n\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.159394 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pznqc\" (UniqueName: \"kubernetes.io/projected/442371dc-c0af-48c8-83fa-01012b636590-kube-api-access-pznqc\") pod \"placement-operator-controller-manager-5784578c99-mtkg9\" (UID: \"442371dc-c0af-48c8-83fa-01012b636590\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.165757 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd"] Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.166815 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.168939 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-c57kx" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.169142 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.169292 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.172259 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.180539 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd"] Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.188998 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kk57\" (UniqueName: \"kubernetes.io/projected/9e05e473-0f8c-41a4-8b84-2c0c0d867f90-kube-api-access-8kk57\") pod \"telemetry-operator-controller-manager-d6b694c5-zngbt\" (UID: \"9e05e473-0f8c-41a4-8b84-2c0c0d867f90\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.189316 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drnbj\" (UniqueName: \"kubernetes.io/projected/9e316228-bcfc-4362-9786-67097a7b0730-kube-api-access-drnbj\") pod \"test-operator-controller-manager-5c5cb9c4d7-tphgw\" (UID: \"9e316228-bcfc-4362-9786-67097a7b0730\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.189426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.189519 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod 
\"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.189656 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8rh7\" (UniqueName: \"kubernetes.io/projected/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-kube-api-access-f8rh7\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.189873 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpkbd\" (UniqueName: \"kubernetes.io/projected/762b742c-1a1d-4a82-8f60-b71e9fe44637-kube-api-access-xpkbd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-bbdh2\" (UID: \"762b742c-1a1d-4a82-8f60-b71e9fe44637\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.206606 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.235086 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.257094 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.260439 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnbj\" (UniqueName: \"kubernetes.io/projected/9e316228-bcfc-4362-9786-67097a7b0730-kube-api-access-drnbj\") pod \"test-operator-controller-manager-5c5cb9c4d7-tphgw\" (UID: \"9e316228-bcfc-4362-9786-67097a7b0730\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.263432 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kk57\" (UniqueName: \"kubernetes.io/projected/9e05e473-0f8c-41a4-8b84-2c0c0d867f90-kube-api-access-8kk57\") pod \"telemetry-operator-controller-manager-d6b694c5-zngbt\" (UID: \"9e05e473-0f8c-41a4-8b84-2c0c0d867f90\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.291577 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.292118 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.292223 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f8rh7\" (UniqueName: \"kubernetes.io/projected/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-kube-api-access-f8rh7\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.292315 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpkbd\" (UniqueName: \"kubernetes.io/projected/762b742c-1a1d-4a82-8f60-b71e9fe44637-kube-api-access-xpkbd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-bbdh2\" (UID: \"762b742c-1a1d-4a82-8f60-b71e9fe44637\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.292432 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.292556 4675 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.292629 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:12.792609757 +0000 UTC m=+1072.826239294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.292824 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.292892 4675 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.292922 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:12.792914606 +0000 UTC m=+1072.826544143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "metrics-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.293455 4675 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.293559 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert podName:597a22d0-7193-41e5-8312-8cc9aa9a29a8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:13.293543674 +0000 UTC m=+1073.327173201 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert") pod "infra-operator-controller-manager-8d4c8954d-flsfk" (UID: "597a22d0-7193-41e5-8312-8cc9aa9a29a8") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.325552 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpkbd\" (UniqueName: \"kubernetes.io/projected/762b742c-1a1d-4a82-8f60-b71e9fe44637-kube-api-access-xpkbd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-bbdh2\" (UID: \"762b742c-1a1d-4a82-8f60-b71e9fe44637\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.336615 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8rh7\" (UniqueName: \"kubernetes.io/projected/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-kube-api-access-f8rh7\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.356949 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.368873 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.404261 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.599653 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.599805 4675 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.599863 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert podName:45153c87-e6ea-4463-9a37-4bc2805530f8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:13.599843938 +0000 UTC m=+1073.633473475 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" (UID: "45153c87-e6ea-4463-9a37-4bc2805530f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.808577 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: I0320 16:19:12.808941 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.809110 4675 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.809161 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:13.809145647 +0000 UTC m=+1073.842775184 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "webhook-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.809402 4675 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:19:12 crc kubenswrapper[4675]: E0320 16:19:12.809488 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:13.809462695 +0000 UTC m=+1073.843092282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "metrics-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.060379 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.076788 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.117905 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq" event={"ID":"19624364-2dec-43f0-961c-12c5071289fd","Type":"ContainerStarted","Data":"4497d070fd1910d9dc5fab04b550aca5c03861df8b60c3b923a3bf79e8f99919"} Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.119340 4675 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l" event={"ID":"56bf5935-2a04-4182-8db7-8b98736a96fa","Type":"ContainerStarted","Data":"b738f9cc9679c437ea4764f52e1d5bb981819a2f58e512712e6b68c013269c37"} Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.130640 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh"] Mar 20 16:19:13 crc kubenswrapper[4675]: W0320 16:19:13.134739 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab24f7d_1842_40a4_8ab1_a0299644ecd5.slice/crio-ed6c31e5f7dc487d0ec62544efea0ed9f349e32672700d1c81a897ae386304b7 WatchSource:0}: Error finding container ed6c31e5f7dc487d0ec62544efea0ed9f349e32672700d1c81a897ae386304b7: Status 404 returned error can't find the container with id ed6c31e5f7dc487d0ec62544efea0ed9f349e32672700d1c81a897ae386304b7 Mar 20 16:19:13 crc kubenswrapper[4675]: W0320 16:19:13.135123 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2c3dde_b80d_4baa_a092_c38d978c7c4e.slice/crio-3820bed5bfbac7d0129a0db553ca7cc7f1566bc99f2cb500f661394b761932cd WatchSource:0}: Error finding container 3820bed5bfbac7d0129a0db553ca7cc7f1566bc99f2cb500f661394b761932cd: Status 404 returned error can't find the container with id 3820bed5bfbac7d0129a0db553ca7cc7f1566bc99f2cb500f661394b761932cd Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.138022 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.147215 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.254508 4675 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-msv57"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.279426 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.290342 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.299519 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.307537 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv"] Mar 20 16:19:13 crc kubenswrapper[4675]: W0320 16:19:13.313375 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f79a1c_7e90_4c87_8027_4044ec669321.slice/crio-c628ab1a11a3fda148e175ec195235344e74c7bcdeefebdacf94e936157d7bb7 WatchSource:0}: Error finding container c628ab1a11a3fda148e175ec195235344e74c7bcdeefebdacf94e936157d7bb7: Status 404 returned error can't find the container with id c628ab1a11a3fda148e175ec195235344e74c7bcdeefebdacf94e936157d7bb7 Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.313427 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.319718 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.321645 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.321872 4675 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.321945 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert podName:597a22d0-7193-41e5-8312-8cc9aa9a29a8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:15.321927901 +0000 UTC m=+1075.355557438 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert") pod "infra-operator-controller-manager-8d4c8954d-flsfk" (UID: "597a22d0-7193-41e5-8312-8cc9aa9a29a8") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.325277 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9"] Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.327072 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4khv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-fpwf9_openstack-operators(1c0ae0a7-9969-47b3-871f-536af2bd1784): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.327223 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bt77t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-m6dhl_openstack-operators(26f79a1c-7e90-4c87-8027-4044ec669321): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.327918 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bbspp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-fvx4c_openstack-operators(14e35dfe-49a2-4d89-9404-3ef0311be41e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.328199 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" podUID="1c0ae0a7-9969-47b3-871f-536af2bd1784" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.328526 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" podUID="26f79a1c-7e90-4c87-8027-4044ec669321" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.328986 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" podUID="14e35dfe-49a2-4d89-9404-3ef0311be41e" Mar 20 16:19:13 crc 
kubenswrapper[4675]: W0320 16:19:13.334004 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34bb9539_fd7b_49da_99dd_548e9e8de389.slice/crio-4afd669266f0b88624d7147077547138ec571a74be0fd6a5f3acef8e676dfd97 WatchSource:0}: Error finding container 4afd669266f0b88624d7147077547138ec571a74be0fd6a5f3acef8e676dfd97: Status 404 returned error can't find the container with id 4afd669266f0b88624d7147077547138ec571a74be0fd6a5f3acef8e676dfd97 Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.334737 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk"] Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.336288 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w6rhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-tg6fk_openstack-operators(34bb9539-fd7b-49da-99dd-548e9e8de389): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.337654 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" podUID="34bb9539-fd7b-49da-99dd-548e9e8de389" Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.339955 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.468190 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9"] Mar 20 16:19:13 crc kubenswrapper[4675]: W0320 16:19:13.471683 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e316228_bcfc_4362_9786_67097a7b0730.slice/crio-450cd3d81275bf1c83f41db3137c4de1f27fc40bbfee7c65801e85f03ec8a870 WatchSource:0}: Error finding container 450cd3d81275bf1c83f41db3137c4de1f27fc40bbfee7c65801e85f03ec8a870: Status 404 returned error can't find the container with id 450cd3d81275bf1c83f41db3137c4de1f27fc40bbfee7c65801e85f03ec8a870 Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.472011 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pznqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-mtkg9_openstack-operators(442371dc-c0af-48c8-83fa-01012b636590): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.473226 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" podUID="442371dc-c0af-48c8-83fa-01012b636590" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.473977 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-drnbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-tphgw_openstack-operators(9e316228-bcfc-4362-9786-67097a7b0730): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.475240 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" podUID="9e316228-bcfc-4362-9786-67097a7b0730" Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.475874 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.486219 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d"] Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.494264 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng"] Mar 20 16:19:13 crc kubenswrapper[4675]: W0320 16:19:13.506719 4675 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc3ed1d_cda9_47fc_b8a8_8c22e7dac6a7.slice/crio-34c21b443694d9dfce192fefdb6e3c7a6e4e31b0ba890e2dc3327e70c78a3890 WatchSource:0}: Error finding container 34c21b443694d9dfce192fefdb6e3c7a6e4e31b0ba890e2dc3327e70c78a3890: Status 404 returned error can't find the container with id 34c21b443694d9dfce192fefdb6e3c7a6e4e31b0ba890e2dc3327e70c78a3890 Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.510485 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpdtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-jh8ng_openstack-operators(1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.511916 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" podUID="1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7" Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.626435 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.626674 4675 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.626752 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert podName:45153c87-e6ea-4463-9a37-4bc2805530f8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:15.626728503 +0000 UTC m=+1075.660358040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" (UID: "45153c87-e6ea-4463-9a37-4bc2805530f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.828752 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:13 crc kubenswrapper[4675]: I0320 16:19:13.828808 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.828902 4675 secret.go:188] Couldn't get 
secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.828983 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:15.828963684 +0000 UTC m=+1075.862593291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "metrics-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.829119 4675 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:19:13 crc kubenswrapper[4675]: E0320 16:19:13.829250 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:15.829222961 +0000 UTC m=+1075.862852498 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "webhook-server-cert" not found Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.128064 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57" event={"ID":"c4f8d16f-3951-48ab-8525-79be59c6d957","Type":"ContainerStarted","Data":"624f56fec36f121fe713a43f3711e6629c8fbbf680a81ac0b9d7a04b3b9a8bb4"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.129978 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl" event={"ID":"fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d","Type":"ContainerStarted","Data":"438de61717792e53706fbc9c1fc550254f432e826c4be4d95fbfc01a80b253a9"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.131344 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" event={"ID":"1c0ae0a7-9969-47b3-871f-536af2bd1784","Type":"ContainerStarted","Data":"8beab872e0fc294ce9cc3d830ce80e9e8d43e49529879b27f07fda5a09b25c0b"} Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.133300 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" podUID="1c0ae0a7-9969-47b3-871f-536af2bd1784" Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.133341 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" event={"ID":"14e35dfe-49a2-4d89-9404-3ef0311be41e","Type":"ContainerStarted","Data":"e9042ef8452f08d582e71bfbe6ea8b11d16ad353ae21f1e9062e65a435fb66be"} Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.134935 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" podUID="14e35dfe-49a2-4d89-9404-3ef0311be41e" Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.135830 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" event={"ID":"26f79a1c-7e90-4c87-8027-4044ec669321","Type":"ContainerStarted","Data":"c628ab1a11a3fda148e175ec195235344e74c7bcdeefebdacf94e936157d7bb7"} Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.137058 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" podUID="26f79a1c-7e90-4c87-8027-4044ec669321" Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.137695 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" event={"ID":"9e05e473-0f8c-41a4-8b84-2c0c0d867f90","Type":"ContainerStarted","Data":"deb4d20f255e9e3c29b3d3e4cda71afccbfac7b4530877ce25e27599973d9f36"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.139368 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv" event={"ID":"5ce8a4db-9cfd-45da-82be-597b6f3b1257","Type":"ContainerStarted","Data":"e8a4823520c1e1cb5d4fed7bdc30bb45d723ce1a6afb8d09eb994ac62e616b36"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.141530 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l" event={"ID":"5718414d-60fc-4af8-a8aa-46a12b8114ab","Type":"ContainerStarted","Data":"7049f7fa5e252da8be494e1208e2a56dedc86737dd8163a0a4291f4b038543c0"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.145981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" event={"ID":"34bb9539-fd7b-49da-99dd-548e9e8de389","Type":"ContainerStarted","Data":"4afd669266f0b88624d7147077547138ec571a74be0fd6a5f3acef8e676dfd97"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.152994 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" event={"ID":"9e316228-bcfc-4362-9786-67097a7b0730","Type":"ContainerStarted","Data":"450cd3d81275bf1c83f41db3137c4de1f27fc40bbfee7c65801e85f03ec8a870"} Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.154105 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" podUID="34bb9539-fd7b-49da-99dd-548e9e8de389" Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.158660 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" podUID="9e316228-bcfc-4362-9786-67097a7b0730" Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.176759 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8" event={"ID":"ec2c3dde-b80d-4baa-a092-c38d978c7c4e","Type":"ContainerStarted","Data":"3820bed5bfbac7d0129a0db553ca7cc7f1566bc99f2cb500f661394b761932cd"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.182644 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" event={"ID":"427a4fcc-2562-4dc5-8735-0f6a448533ab","Type":"ContainerStarted","Data":"2df5ede7d5aa8e051a3e37f5a93d3f4c0d8ebb9559613482567200c730b7bfbc"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.186383 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" event={"ID":"442371dc-c0af-48c8-83fa-01012b636590","Type":"ContainerStarted","Data":"6ac48ae96b31ef42f4d074596070694ea02cbbcc1e0f2f3b8c2e11526f90b113"} Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.193384 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" podUID="442371dc-c0af-48c8-83fa-01012b636590" Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.196332 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" 
event={"ID":"4a01ec4a-d453-44a0-ae13-ef8607f3ccb3","Type":"ContainerStarted","Data":"fcae9b5c6bcc65c014b328e3c63d0594cea3b992f913475dd6a9a6da100e09ba"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.203662 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh" event={"ID":"5ab24f7d-1842-40a4-8ab1-a0299644ecd5","Type":"ContainerStarted","Data":"ed6c31e5f7dc487d0ec62544efea0ed9f349e32672700d1c81a897ae386304b7"} Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.209437 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" event={"ID":"1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7","Type":"ContainerStarted","Data":"34c21b443694d9dfce192fefdb6e3c7a6e4e31b0ba890e2dc3327e70c78a3890"} Mar 20 16:19:14 crc kubenswrapper[4675]: E0320 16:19:14.215103 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" podUID="1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7" Mar 20 16:19:14 crc kubenswrapper[4675]: I0320 16:19:14.219941 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" event={"ID":"762b742c-1a1d-4a82-8f60-b71e9fe44637","Type":"ContainerStarted","Data":"52b36ad66d179c508f14a347217d7734fbbe4ac0386560eee86b945b335e44e1"} Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.227576 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" podUID="1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.227713 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" podUID="442371dc-c0af-48c8-83fa-01012b636590" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.227791 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" podUID="9e316228-bcfc-4362-9786-67097a7b0730" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.227817 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" podUID="26f79a1c-7e90-4c87-8027-4044ec669321" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.228149 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" podUID="34bb9539-fd7b-49da-99dd-548e9e8de389" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.228249 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" podUID="14e35dfe-49a2-4d89-9404-3ef0311be41e" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.229074 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" podUID="1c0ae0a7-9969-47b3-871f-536af2bd1784" Mar 20 16:19:15 crc kubenswrapper[4675]: I0320 16:19:15.349190 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.349345 4675 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.349402 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert podName:597a22d0-7193-41e5-8312-8cc9aa9a29a8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:19.349385793 +0000 UTC m=+1079.383015330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert") pod "infra-operator-controller-manager-8d4c8954d-flsfk" (UID: "597a22d0-7193-41e5-8312-8cc9aa9a29a8") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: I0320 16:19:15.655302 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.655614 4675 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.655688 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert podName:45153c87-e6ea-4463-9a37-4bc2805530f8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:19.655657926 +0000 UTC m=+1079.689287463 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" (UID: "45153c87-e6ea-4463-9a37-4bc2805530f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: I0320 16:19:15.857235 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:15 crc kubenswrapper[4675]: I0320 16:19:15.857293 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.857480 4675 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.857537 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:19.857519597 +0000 UTC m=+1079.891149134 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "webhook-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.857961 4675 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:19:15 crc kubenswrapper[4675]: E0320 16:19:15.858002 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:19.85798852 +0000 UTC m=+1079.891618057 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "metrics-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: I0320 16:19:19.404877 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.405028 4675 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.405373 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert 
podName:597a22d0-7193-41e5-8312-8cc9aa9a29a8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:27.405356999 +0000 UTC m=+1087.438986536 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert") pod "infra-operator-controller-manager-8d4c8954d-flsfk" (UID: "597a22d0-7193-41e5-8312-8cc9aa9a29a8") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: I0320 16:19:19.709361 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.709709 4675 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.709836 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert podName:45153c87-e6ea-4463-9a37-4bc2805530f8 nodeName:}" failed. No retries permitted until 2026-03-20 16:19:27.709816891 +0000 UTC m=+1087.743446428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" (UID: "45153c87-e6ea-4463-9a37-4bc2805530f8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: I0320 16:19:19.913023 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.913235 4675 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: I0320 16:19:19.913263 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.913311 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:27.913293657 +0000 UTC m=+1087.946923194 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "webhook-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.913342 4675 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:19:19 crc kubenswrapper[4675]: E0320 16:19:19.913388 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:27.913376679 +0000 UTC m=+1087.947006216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "metrics-server-cert" not found Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.278108 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l" event={"ID":"5718414d-60fc-4af8-a8aa-46a12b8114ab","Type":"ContainerStarted","Data":"871e55c3c168c7736633ee48d60a39fe3227d15baf3bada834c81a3e9bf0e626"} Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.278711 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.288161 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl" 
event={"ID":"fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d","Type":"ContainerStarted","Data":"f1b7178bf043c108f2e46068092be44e78c9125b64e2ce651110bdf455231ce4"} Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.288899 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.292850 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l" event={"ID":"56bf5935-2a04-4182-8db7-8b98736a96fa","Type":"ContainerStarted","Data":"61166e4013b97d019d64f6efcd205a88ba96437df7c358639c3a030713d0f37b"} Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.293560 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.295941 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh" event={"ID":"5ab24f7d-1842-40a4-8ab1-a0299644ecd5","Type":"ContainerStarted","Data":"3338b9a9152684e6c6d1b1fbd3c5a9ecd7c9c08a522183d4c0bff1afd6d36401"} Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.296343 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.302367 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l" podStartSLOduration=2.717568916 podStartE2EDuration="11.302343211s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.32618026 +0000 UTC m=+1073.359809797" lastFinishedPulling="2026-03-20 16:19:21.910954555 +0000 UTC m=+1081.944584092" 
observedRunningTime="2026-03-20 16:19:22.295944722 +0000 UTC m=+1082.329574259" watchObservedRunningTime="2026-03-20 16:19:22.302343211 +0000 UTC m=+1082.335972748" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.311034 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.324625 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l" podStartSLOduration=2.5362401390000002 podStartE2EDuration="11.324604364s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.076449629 +0000 UTC m=+1073.110079166" lastFinishedPulling="2026-03-20 16:19:21.864813864 +0000 UTC m=+1081.898443391" observedRunningTime="2026-03-20 16:19:22.31657657 +0000 UTC m=+1082.350206107" watchObservedRunningTime="2026-03-20 16:19:22.324604364 +0000 UTC m=+1082.358233901" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.342198 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl" podStartSLOduration=2.621778044 podStartE2EDuration="11.342180706s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.14434259 +0000 UTC m=+1073.177972127" lastFinishedPulling="2026-03-20 16:19:21.864745262 +0000 UTC m=+1081.898374789" observedRunningTime="2026-03-20 16:19:22.339210183 +0000 UTC m=+1082.372839720" watchObservedRunningTime="2026-03-20 16:19:22.342180706 +0000 UTC m=+1082.375810243" Mar 20 16:19:22 crc kubenswrapper[4675]: I0320 16:19:22.371525 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh" podStartSLOduration=2.637976308 podStartE2EDuration="11.371503267s" 
podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.137056316 +0000 UTC m=+1073.170685853" lastFinishedPulling="2026-03-20 16:19:21.870583275 +0000 UTC m=+1081.904212812" observedRunningTime="2026-03-20 16:19:22.364015508 +0000 UTC m=+1082.397645045" watchObservedRunningTime="2026-03-20 16:19:22.371503267 +0000 UTC m=+1082.405132804" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.316737 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq" event={"ID":"19624364-2dec-43f0-961c-12c5071289fd","Type":"ContainerStarted","Data":"b7ca5d4241f588f3ece7c170439205cdff5659eb8c674a3647b3b0a16d2427dc"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.317496 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.318933 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" event={"ID":"762b742c-1a1d-4a82-8f60-b71e9fe44637","Type":"ContainerStarted","Data":"5ff51a617f35eb3d86069cade531fdec3e4f6f176ba420245f91664dd7328aca"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.319288 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.320213 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" event={"ID":"9e05e473-0f8c-41a4-8b84-2c0c0d867f90","Type":"ContainerStarted","Data":"71997a61f4f4cca2040cb25a142d77ff2e88e42e2911b729538b60d7c92c7da5"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.320559 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.322058 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57" event={"ID":"c4f8d16f-3951-48ab-8525-79be59c6d957","Type":"ContainerStarted","Data":"3c3744a02b9a6f0c6c95c0ab4461706325ba2f07a96acce7774dabbca7b1c403"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.322122 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.323335 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" event={"ID":"427a4fcc-2562-4dc5-8735-0f6a448533ab","Type":"ContainerStarted","Data":"7cd44eb396fb817250e380599f327e54eb797913be6b21079d2de4f34a50d90a"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.323478 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.324733 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" event={"ID":"4a01ec4a-d453-44a0-ae13-ef8607f3ccb3","Type":"ContainerStarted","Data":"95d5f8583d19a4ab75550d60052359081738cf7b4d53631efb4f4058e02ee090"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.324812 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.326281 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv" 
event={"ID":"5ce8a4db-9cfd-45da-82be-597b6f3b1257","Type":"ContainerStarted","Data":"67f33e0060683c224a698000ce898d74440d4597239b50a9b6f498ec8b64c768"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.326602 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.329564 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8" event={"ID":"ec2c3dde-b80d-4baa-a092-c38d978c7c4e","Type":"ContainerStarted","Data":"11732d7b956fe3115b5a0ce193bfbcfb88ab2a8b371d7124eb1ec6334333c865"} Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.334268 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8" podStartSLOduration=3.555174792 podStartE2EDuration="12.334257267s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.143038593 +0000 UTC m=+1073.176668130" lastFinishedPulling="2026-03-20 16:19:21.922121058 +0000 UTC m=+1081.955750605" observedRunningTime="2026-03-20 16:19:22.390078407 +0000 UTC m=+1082.423707944" watchObservedRunningTime="2026-03-20 16:19:23.334257267 +0000 UTC m=+1083.367886804" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.335155 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq" podStartSLOduration=3.509668908 podStartE2EDuration="12.335148742s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.097149938 +0000 UTC m=+1073.130779475" lastFinishedPulling="2026-03-20 16:19:21.922629762 +0000 UTC m=+1081.956259309" observedRunningTime="2026-03-20 16:19:23.331592982 +0000 UTC m=+1083.365222529" watchObservedRunningTime="2026-03-20 
16:19:23.335148742 +0000 UTC m=+1083.368778279" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.351702 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" podStartSLOduration=3.753877005 podStartE2EDuration="12.351681785s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.326796977 +0000 UTC m=+1073.360426514" lastFinishedPulling="2026-03-20 16:19:21.924601757 +0000 UTC m=+1081.958231294" observedRunningTime="2026-03-20 16:19:23.346133749 +0000 UTC m=+1083.379763286" watchObservedRunningTime="2026-03-20 16:19:23.351681785 +0000 UTC m=+1083.385311322" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.367095 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57" podStartSLOduration=3.717969469 podStartE2EDuration="12.367077086s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.273046422 +0000 UTC m=+1073.306675969" lastFinishedPulling="2026-03-20 16:19:21.922154049 +0000 UTC m=+1081.955783586" observedRunningTime="2026-03-20 16:19:23.359926725 +0000 UTC m=+1083.393556272" watchObservedRunningTime="2026-03-20 16:19:23.367077086 +0000 UTC m=+1083.400706623" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.394024 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" podStartSLOduration=3.950387105 podStartE2EDuration="12.394007979s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.49337607 +0000 UTC m=+1073.527005607" lastFinishedPulling="2026-03-20 16:19:21.936996944 +0000 UTC m=+1081.970626481" observedRunningTime="2026-03-20 16:19:23.389129813 +0000 UTC m=+1083.422759350" watchObservedRunningTime="2026-03-20 16:19:23.394007979 
+0000 UTC m=+1083.427637516" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.428042 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" podStartSLOduration=3.832006281 podStartE2EDuration="12.428023552s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.326812787 +0000 UTC m=+1073.360442324" lastFinishedPulling="2026-03-20 16:19:21.922830058 +0000 UTC m=+1081.956459595" observedRunningTime="2026-03-20 16:19:23.412045604 +0000 UTC m=+1083.445675151" watchObservedRunningTime="2026-03-20 16:19:23.428023552 +0000 UTC m=+1083.461653089" Mar 20 16:19:23 crc kubenswrapper[4675]: I0320 16:19:23.429888 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv" podStartSLOduration=3.796261471 podStartE2EDuration="12.429881244s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.288541766 +0000 UTC m=+1073.322171303" lastFinishedPulling="2026-03-20 16:19:21.922161539 +0000 UTC m=+1081.955791076" observedRunningTime="2026-03-20 16:19:23.424502223 +0000 UTC m=+1083.458131760" watchObservedRunningTime="2026-03-20 16:19:23.429881244 +0000 UTC m=+1083.463510781" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.418253 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod \"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.424459 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/597a22d0-7193-41e5-8312-8cc9aa9a29a8-cert\") pod 
\"infra-operator-controller-manager-8d4c8954d-flsfk\" (UID: \"597a22d0-7193-41e5-8312-8cc9aa9a29a8\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.487270 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t4md6" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.495531 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.724604 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.729320 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45153c87-e6ea-4463-9a37-4bc2805530f8-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5nv6w9\" (UID: \"45153c87-e6ea-4463-9a37-4bc2805530f8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.849423 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cf9gq" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.857960 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.929013 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.929066 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:27 crc kubenswrapper[4675]: E0320 16:19:27.929215 4675 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:19:27 crc kubenswrapper[4675]: E0320 16:19:27.929269 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs podName:74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b nodeName:}" failed. No retries permitted until 2026-03-20 16:19:43.92925248 +0000 UTC m=+1103.962882027 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-skxzd" (UID: "74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b") : secret "webhook-server-cert" not found Mar 20 16:19:27 crc kubenswrapper[4675]: I0320 16:19:27.935223 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.314161 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" podStartSLOduration=9.717402786 podStartE2EDuration="18.314141796s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.326620682 +0000 UTC m=+1073.360250219" lastFinishedPulling="2026-03-20 16:19:21.923359692 +0000 UTC m=+1081.956989229" observedRunningTime="2026-03-20 16:19:23.445891262 +0000 UTC m=+1083.479520799" watchObservedRunningTime="2026-03-20 16:19:29.314141796 +0000 UTC m=+1089.347771333" Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.321259 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk"] Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.375951 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9"] Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.390983 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" 
event={"ID":"1c0ae0a7-9969-47b3-871f-536af2bd1784","Type":"ContainerStarted","Data":"5ec43279e587dbeee44cf629bfaa3bcaa309abcb6d8e5f8103864cffda6331e7"} Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.392302 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.394648 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" event={"ID":"34bb9539-fd7b-49da-99dd-548e9e8de389","Type":"ContainerStarted","Data":"145843288d10de80c96da5286fa214d5fbd61c0414e675df0ec6cd6f1faf3722"} Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.395168 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.411613 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" podStartSLOduration=2.837567355 podStartE2EDuration="18.411590064s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.326937381 +0000 UTC m=+1073.360566918" lastFinishedPulling="2026-03-20 16:19:28.90096009 +0000 UTC m=+1088.934589627" observedRunningTime="2026-03-20 16:19:29.406157732 +0000 UTC m=+1089.439787269" watchObservedRunningTime="2026-03-20 16:19:29.411590064 +0000 UTC m=+1089.445219601" Mar 20 16:19:29 crc kubenswrapper[4675]: I0320 16:19:29.426940 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" podStartSLOduration=2.887918135 podStartE2EDuration="18.426921823s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.3361826 +0000 UTC m=+1073.369812137" 
lastFinishedPulling="2026-03-20 16:19:28.875186278 +0000 UTC m=+1088.908815825" observedRunningTime="2026-03-20 16:19:29.424166166 +0000 UTC m=+1089.457795723" watchObservedRunningTime="2026-03-20 16:19:29.426921823 +0000 UTC m=+1089.460551360" Mar 20 16:19:29 crc kubenswrapper[4675]: W0320 16:19:29.941211 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod597a22d0_7193_41e5_8312_8cc9aa9a29a8.slice/crio-3ad62189b33f2b3d8bd500d02d1a1fee2f18f7173b71349d3c48368bbbb3bd95 WatchSource:0}: Error finding container 3ad62189b33f2b3d8bd500d02d1a1fee2f18f7173b71349d3c48368bbbb3bd95: Status 404 returned error can't find the container with id 3ad62189b33f2b3d8bd500d02d1a1fee2f18f7173b71349d3c48368bbbb3bd95 Mar 20 16:19:30 crc kubenswrapper[4675]: I0320 16:19:30.403328 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" event={"ID":"45153c87-e6ea-4463-9a37-4bc2805530f8","Type":"ContainerStarted","Data":"e2dae1ee4469ab053a07da1e1a47a50c3bdabf67500d29f702f823d1b005e697"} Mar 20 16:19:30 crc kubenswrapper[4675]: I0320 16:19:30.405285 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" event={"ID":"597a22d0-7193-41e5-8312-8cc9aa9a29a8","Type":"ContainerStarted","Data":"3ad62189b33f2b3d8bd500d02d1a1fee2f18f7173b71349d3c48368bbbb3bd95"} Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.413626 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" event={"ID":"9e316228-bcfc-4362-9786-67097a7b0730","Type":"ContainerStarted","Data":"89519f4b6520355507da3207ba5f3e2d6d214222aa6d9599ab77e2d03d58e4ff"} Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.414133 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.423863 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" event={"ID":"1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7","Type":"ContainerStarted","Data":"09cd24aa7b15f66fcb83ba25f439479e0f395cfb0cac77376a824acb7b2b64f2"} Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.424368 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.426134 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" event={"ID":"442371dc-c0af-48c8-83fa-01012b636590","Type":"ContainerStarted","Data":"7c68484479e97d163cd170c5bb720cea452f09e2522c9c27914e447beee1eca4"} Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.426898 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.433674 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" podStartSLOduration=3.931526366 podStartE2EDuration="20.433657686s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.473819922 +0000 UTC m=+1073.507449459" lastFinishedPulling="2026-03-20 16:19:29.975951242 +0000 UTC m=+1090.009580779" observedRunningTime="2026-03-20 16:19:31.4277267 +0000 UTC m=+1091.461356237" watchObservedRunningTime="2026-03-20 16:19:31.433657686 +0000 UTC m=+1091.467287223" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.452004 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" podStartSLOduration=2.742207605 podStartE2EDuration="20.451969529s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.471878888 +0000 UTC m=+1073.505508425" lastFinishedPulling="2026-03-20 16:19:31.181640812 +0000 UTC m=+1091.215270349" observedRunningTime="2026-03-20 16:19:31.446131145 +0000 UTC m=+1091.479760692" watchObservedRunningTime="2026-03-20 16:19:31.451969529 +0000 UTC m=+1091.485599066" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.464190 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" podStartSLOduration=2.791607699 podStartE2EDuration="20.464173931s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.510385416 +0000 UTC m=+1073.544014953" lastFinishedPulling="2026-03-20 16:19:31.182951648 +0000 UTC m=+1091.216581185" observedRunningTime="2026-03-20 16:19:31.46272168 +0000 UTC m=+1091.496351257" watchObservedRunningTime="2026-03-20 16:19:31.464173931 +0000 UTC m=+1091.497803468" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.736264 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c6j4l" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.786963 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jrphq" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.801877 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ktbv8" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.838513 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-k6lwh" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.887167 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vrwzl" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.914086 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-gvrbv" Mar 20 16:19:31 crc kubenswrapper[4675]: I0320 16:19:31.997843 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-q4k6l" Mar 20 16:19:32 crc kubenswrapper[4675]: I0320 16:19:32.072014 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-msv57" Mar 20 16:19:32 crc kubenswrapper[4675]: I0320 16:19:32.477293 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-bbdh2" Mar 20 16:19:32 crc kubenswrapper[4675]: I0320 16:19:32.477882 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bk29d" Mar 20 16:19:32 crc kubenswrapper[4675]: I0320 16:19:32.480990 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hrp5b" Mar 20 16:19:32 crc kubenswrapper[4675]: I0320 16:19:32.481018 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-zngbt" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.508534 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" event={"ID":"45153c87-e6ea-4463-9a37-4bc2805530f8","Type":"ContainerStarted","Data":"4a54341c11a3a125e4ff9626acf51c3eacc580a68d7dc4de0139a26c3d23ee41"} Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.509043 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.510464 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" event={"ID":"597a22d0-7193-41e5-8312-8cc9aa9a29a8","Type":"ContainerStarted","Data":"3eed9794356dd3a97fcda953d37b6a241ed4a6e415eef6fea59472d66796238a"} Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.510569 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.511933 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" event={"ID":"14e35dfe-49a2-4d89-9404-3ef0311be41e","Type":"ContainerStarted","Data":"c60d12ae6463d99b0de87f25933a8be7463d567829cd05546e45978fa5a00e4a"} Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.512378 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.513392 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" event={"ID":"26f79a1c-7e90-4c87-8027-4044ec669321","Type":"ContainerStarted","Data":"9c9e4da4e33f067f2562610733430d7f65d2b9eca82da783d86cbd567d32a308"} Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.513591 4675 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.536767 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" podStartSLOduration=19.848119425 podStartE2EDuration="24.5367487s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:29.970310464 +0000 UTC m=+1090.003940001" lastFinishedPulling="2026-03-20 16:19:34.658939739 +0000 UTC m=+1094.692569276" observedRunningTime="2026-03-20 16:19:35.535336241 +0000 UTC m=+1095.568965788" watchObservedRunningTime="2026-03-20 16:19:35.5367487 +0000 UTC m=+1095.570378237" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.559807 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" podStartSLOduration=19.850556212 podStartE2EDuration="24.559791085s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:29.970305933 +0000 UTC m=+1090.003935470" lastFinishedPulling="2026-03-20 16:19:34.679540806 +0000 UTC m=+1094.713170343" observedRunningTime="2026-03-20 16:19:35.552923963 +0000 UTC m=+1095.586553500" watchObservedRunningTime="2026-03-20 16:19:35.559791085 +0000 UTC m=+1095.593420622" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.577957 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" podStartSLOduration=3.253175967 podStartE2EDuration="24.577941703s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.327808785 +0000 UTC m=+1073.361438322" lastFinishedPulling="2026-03-20 16:19:34.652574501 +0000 UTC m=+1094.686204058" observedRunningTime="2026-03-20 
16:19:35.573391096 +0000 UTC m=+1095.607020633" watchObservedRunningTime="2026-03-20 16:19:35.577941703 +0000 UTC m=+1095.611571240" Mar 20 16:19:35 crc kubenswrapper[4675]: I0320 16:19:35.600228 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" podStartSLOduration=3.272343065 podStartE2EDuration="24.600208627s" podCreationTimestamp="2026-03-20 16:19:11 +0000 UTC" firstStartedPulling="2026-03-20 16:19:13.327004093 +0000 UTC m=+1073.360633630" lastFinishedPulling="2026-03-20 16:19:34.654869655 +0000 UTC m=+1094.688499192" observedRunningTime="2026-03-20 16:19:35.594988391 +0000 UTC m=+1095.628617928" watchObservedRunningTime="2026-03-20 16:19:35.600208627 +0000 UTC m=+1095.633838154" Mar 20 16:19:41 crc kubenswrapper[4675]: I0320 16:19:41.855633 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m6dhl" Mar 20 16:19:41 crc kubenswrapper[4675]: I0320 16:19:41.932829 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpwf9" Mar 20 16:19:41 crc kubenswrapper[4675]: I0320 16:19:41.957743 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tg6fk" Mar 20 16:19:42 crc kubenswrapper[4675]: I0320 16:19:42.174904 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-fvx4c" Mar 20 16:19:42 crc kubenswrapper[4675]: I0320 16:19:42.260620 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-mtkg9" Mar 20 16:19:42 crc kubenswrapper[4675]: I0320 16:19:42.295589 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jh8ng" Mar 20 16:19:42 crc kubenswrapper[4675]: I0320 16:19:42.371501 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-tphgw" Mar 20 16:19:43 crc kubenswrapper[4675]: I0320 16:19:43.938093 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:43 crc kubenswrapper[4675]: I0320 16:19:43.946732 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-skxzd\" (UID: \"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:44 crc kubenswrapper[4675]: I0320 16:19:44.235268 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-c57kx" Mar 20 16:19:44 crc kubenswrapper[4675]: I0320 16:19:44.243149 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:44 crc kubenswrapper[4675]: I0320 16:19:44.460145 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd"] Mar 20 16:19:44 crc kubenswrapper[4675]: I0320 16:19:44.575953 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" event={"ID":"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b","Type":"ContainerStarted","Data":"ac3546a8b4c65dcccc2374d35d7a53dfd0f409ce343da9d9de358d700be8f503"} Mar 20 16:19:45 crc kubenswrapper[4675]: I0320 16:19:45.586836 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" event={"ID":"74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b","Type":"ContainerStarted","Data":"e83cb403c80bef823a0a124c83b2ea7aa71c23560bba7a54f9c43b348a2eaec2"} Mar 20 16:19:46 crc kubenswrapper[4675]: I0320 16:19:46.596357 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:19:46 crc kubenswrapper[4675]: I0320 16:19:46.634190 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" podStartSLOduration=34.634171336 podStartE2EDuration="34.634171336s" podCreationTimestamp="2026-03-20 16:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:19:46.628376221 +0000 UTC m=+1106.662005798" watchObservedRunningTime="2026-03-20 16:19:46.634171336 +0000 UTC m=+1106.667800893" Mar 20 16:19:47 crc kubenswrapper[4675]: I0320 16:19:47.515269 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-flsfk" Mar 20 16:19:47 crc kubenswrapper[4675]: I0320 16:19:47.865338 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5nv6w9" Mar 20 16:19:54 crc kubenswrapper[4675]: I0320 16:19:54.251060 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-skxzd" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.150591 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567060-sn7kc"] Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.152128 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.154174 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.154479 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.154383 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.166691 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-sn7kc"] Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.255684 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2szz\" (UniqueName: \"kubernetes.io/projected/709c8047-9593-4375-aaae-b982f574e0c0-kube-api-access-h2szz\") pod \"auto-csr-approver-29567060-sn7kc\" (UID: \"709c8047-9593-4375-aaae-b982f574e0c0\") " 
pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.357515 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2szz\" (UniqueName: \"kubernetes.io/projected/709c8047-9593-4375-aaae-b982f574e0c0-kube-api-access-h2szz\") pod \"auto-csr-approver-29567060-sn7kc\" (UID: \"709c8047-9593-4375-aaae-b982f574e0c0\") " pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.377167 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2szz\" (UniqueName: \"kubernetes.io/projected/709c8047-9593-4375-aaae-b982f574e0c0-kube-api-access-h2szz\") pod \"auto-csr-approver-29567060-sn7kc\" (UID: \"709c8047-9593-4375-aaae-b982f574e0c0\") " pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.475788 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:00 crc kubenswrapper[4675]: I0320 16:20:00.928615 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-sn7kc"] Mar 20 16:20:01 crc kubenswrapper[4675]: I0320 16:20:01.708326 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-sn7kc" event={"ID":"709c8047-9593-4375-aaae-b982f574e0c0","Type":"ContainerStarted","Data":"35d2a1ea57d175761927b530b26f545cca9bcbcffa3c1ecb9645bf6b486f754c"} Mar 20 16:20:03 crc kubenswrapper[4675]: I0320 16:20:03.721818 4675 generic.go:334] "Generic (PLEG): container finished" podID="709c8047-9593-4375-aaae-b982f574e0c0" containerID="b787edcb44bf9ee2b1470910ae1cafe7fbe75892d4dcdd8e91be284e1a07d49f" exitCode=0 Mar 20 16:20:03 crc kubenswrapper[4675]: I0320 16:20:03.721893 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567060-sn7kc" event={"ID":"709c8047-9593-4375-aaae-b982f574e0c0","Type":"ContainerDied","Data":"b787edcb44bf9ee2b1470910ae1cafe7fbe75892d4dcdd8e91be284e1a07d49f"} Mar 20 16:20:04 crc kubenswrapper[4675]: I0320 16:20:04.425076 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:20:04 crc kubenswrapper[4675]: I0320 16:20:04.425412 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.154860 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.332079 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2szz\" (UniqueName: \"kubernetes.io/projected/709c8047-9593-4375-aaae-b982f574e0c0-kube-api-access-h2szz\") pod \"709c8047-9593-4375-aaae-b982f574e0c0\" (UID: \"709c8047-9593-4375-aaae-b982f574e0c0\") " Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.337347 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709c8047-9593-4375-aaae-b982f574e0c0-kube-api-access-h2szz" (OuterVolumeSpecName: "kube-api-access-h2szz") pod "709c8047-9593-4375-aaae-b982f574e0c0" (UID: "709c8047-9593-4375-aaae-b982f574e0c0"). InnerVolumeSpecName "kube-api-access-h2szz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.434132 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2szz\" (UniqueName: \"kubernetes.io/projected/709c8047-9593-4375-aaae-b982f574e0c0-kube-api-access-h2szz\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.738727 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-sn7kc" event={"ID":"709c8047-9593-4375-aaae-b982f574e0c0","Type":"ContainerDied","Data":"35d2a1ea57d175761927b530b26f545cca9bcbcffa3c1ecb9645bf6b486f754c"} Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.738803 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d2a1ea57d175761927b530b26f545cca9bcbcffa3c1ecb9645bf6b486f754c" Mar 20 16:20:05 crc kubenswrapper[4675]: I0320 16:20:05.738903 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-sn7kc" Mar 20 16:20:06 crc kubenswrapper[4675]: I0320 16:20:06.234440 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-vr2qs"] Mar 20 16:20:06 crc kubenswrapper[4675]: I0320 16:20:06.239741 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-vr2qs"] Mar 20 16:20:06 crc kubenswrapper[4675]: I0320 16:20:06.683794 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb90f724-cdb5-440c-885c-2933b81bd258" path="/var/lib/kubelet/pods/bb90f724-cdb5-440c-885c-2933b81bd258/volumes" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.556323 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m74dd"] Mar 20 16:20:11 crc kubenswrapper[4675]: E0320 16:20:11.557003 4675 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="709c8047-9593-4375-aaae-b982f574e0c0" containerName="oc" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.557018 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="709c8047-9593-4375-aaae-b982f574e0c0" containerName="oc" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.557207 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="709c8047-9593-4375-aaae-b982f574e0c0" containerName="oc" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.558061 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.562846 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.563105 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.566319 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.566412 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-59l87" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.602292 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m74dd"] Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.716341 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5lrzg"] Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.717486 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.719844 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.726088 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-config\") pod \"dnsmasq-dns-675f4bcbfc-m74dd\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.726137 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2knp\" (UniqueName: \"kubernetes.io/projected/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-kube-api-access-b2knp\") pod \"dnsmasq-dns-675f4bcbfc-m74dd\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.728849 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5lrzg"] Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.827043 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-config\") pod \"dnsmasq-dns-675f4bcbfc-m74dd\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.827097 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2knp\" (UniqueName: \"kubernetes.io/projected/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-kube-api-access-b2knp\") pod \"dnsmasq-dns-675f4bcbfc-m74dd\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc 
kubenswrapper[4675]: I0320 16:20:11.827133 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.827156 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w4m8\" (UniqueName: \"kubernetes.io/projected/b590f984-fe7e-4e88-a997-12ea0653abc2-kube-api-access-7w4m8\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.827208 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-config\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.828007 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-config\") pod \"dnsmasq-dns-675f4bcbfc-m74dd\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.848157 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2knp\" (UniqueName: \"kubernetes.io/projected/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-kube-api-access-b2knp\") pod \"dnsmasq-dns-675f4bcbfc-m74dd\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 
16:20:11.880328 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.927932 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.927990 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w4m8\" (UniqueName: \"kubernetes.io/projected/b590f984-fe7e-4e88-a997-12ea0653abc2-kube-api-access-7w4m8\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.928028 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-config\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.928756 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.929108 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-config\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:11 crc kubenswrapper[4675]: I0320 16:20:11.943620 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w4m8\" (UniqueName: \"kubernetes.io/projected/b590f984-fe7e-4e88-a997-12ea0653abc2-kube-api-access-7w4m8\") pod \"dnsmasq-dns-78dd6ddcc-5lrzg\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:12 crc kubenswrapper[4675]: I0320 16:20:12.034471 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:12 crc kubenswrapper[4675]: I0320 16:20:12.273496 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m74dd"] Mar 20 16:20:12 crc kubenswrapper[4675]: I0320 16:20:12.426912 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5lrzg"] Mar 20 16:20:12 crc kubenswrapper[4675]: I0320 16:20:12.792157 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" event={"ID":"e0f3d2f9-afe2-4551-8bac-25b7a220ecca","Type":"ContainerStarted","Data":"08811964d78f6273bea97110fb5659ad367cd04043fb7cdb1b2944e0e038f580"} Mar 20 16:20:12 crc kubenswrapper[4675]: I0320 16:20:12.793198 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" event={"ID":"b590f984-fe7e-4e88-a997-12ea0653abc2","Type":"ContainerStarted","Data":"8288a460322f703c4ccf04bc289e4ca8645e9bdc6b2b5cd0994deedd8c991d98"} Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.235917 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m74dd"] Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.258385 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-t7w6q"] Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.271526 4675 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.288215 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-t7w6q"] Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.464922 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-config\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.464992 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sl5z\" (UniqueName: \"kubernetes.io/projected/f35cdc3f-cbcf-46b5-8988-a077a4e65284-kube-api-access-2sl5z\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.465041 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-dns-svc\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.554958 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5lrzg"] Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.567173 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-config\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 
20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.567263 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sl5z\" (UniqueName: \"kubernetes.io/projected/f35cdc3f-cbcf-46b5-8988-a077a4e65284-kube-api-access-2sl5z\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.567310 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-dns-svc\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.568376 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-config\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.569429 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-dns-svc\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.582475 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ptlpx"] Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.583584 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.598234 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sl5z\" (UniqueName: \"kubernetes.io/projected/f35cdc3f-cbcf-46b5-8988-a077a4e65284-kube-api-access-2sl5z\") pod \"dnsmasq-dns-666b6646f7-t7w6q\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.599219 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ptlpx"] Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.601510 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.770101 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2cj\" (UniqueName: \"kubernetes.io/projected/b91e693b-6453-443b-8211-6a29b07c91bd-kube-api-access-zb2cj\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.770183 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-config\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.770258 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.871477 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2cj\" (UniqueName: \"kubernetes.io/projected/b91e693b-6453-443b-8211-6a29b07c91bd-kube-api-access-zb2cj\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.871617 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-config\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.871745 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.872903 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.873654 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-config\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.894728 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2cj\" (UniqueName: \"kubernetes.io/projected/b91e693b-6453-443b-8211-6a29b07c91bd-kube-api-access-zb2cj\") pod \"dnsmasq-dns-57d769cc4f-ptlpx\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:14 crc kubenswrapper[4675]: I0320 16:20:14.937459 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.133931 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-t7w6q"] Mar 20 16:20:15 crc kubenswrapper[4675]: W0320 16:20:15.142860 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf35cdc3f_cbcf_46b5_8988_a077a4e65284.slice/crio-0d60f7b72d9e28a516e5a7196ff480147dba0c8f6e9f74acd9ef0755f1a29e29 WatchSource:0}: Error finding container 0d60f7b72d9e28a516e5a7196ff480147dba0c8f6e9f74acd9ef0755f1a29e29: Status 404 returned error can't find the container with id 0d60f7b72d9e28a516e5a7196ff480147dba0c8f6e9f74acd9ef0755f1a29e29 Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.227968 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.229309 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.233263 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.233473 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.233662 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.233704 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.233726 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.233841 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.235453 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bl6tb" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.240837 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.372642 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ptlpx"] Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.378887 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87f2f4be-70c8-409a-8fe8-c753758021f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.378928 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87f2f4be-70c8-409a-8fe8-c753758021f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.378960 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.378985 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379097 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379137 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379174 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dc25j\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-kube-api-access-dc25j\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379239 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379275 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379298 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.379325 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480354 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480422 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480451 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480484 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480521 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87f2f4be-70c8-409a-8fe8-c753758021f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480546 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87f2f4be-70c8-409a-8fe8-c753758021f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480569 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480594 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480618 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480649 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.480684 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc25j\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-kube-api-access-dc25j\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.481121 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.481331 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.481409 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.481428 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.481880 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.482886 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87f2f4be-70c8-409a-8fe8-c753758021f4-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.485278 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.485735 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87f2f4be-70c8-409a-8fe8-c753758021f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.486736 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.489110 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87f2f4be-70c8-409a-8fe8-c753758021f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.497657 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc25j\" (UniqueName: \"kubernetes.io/projected/87f2f4be-70c8-409a-8fe8-c753758021f4-kube-api-access-dc25j\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.500451 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"87f2f4be-70c8-409a-8fe8-c753758021f4\") " pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.528921 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.530548 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.536315 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.536387 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.536918 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.537125 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.537314 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t7s7j" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.537511 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.543653 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.543762 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 16:20:15 crc 
kubenswrapper[4675]: I0320 16:20:15.558120 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683577 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683615 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683645 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683671 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683690 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2786789-8885-42c4-9127-c0466e2212eb-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683711 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683730 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6lgh\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-kube-api-access-n6lgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683754 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683790 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2786789-8885-42c4-9127-c0466e2212eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683825 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.683847 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785107 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785491 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2786789-8885-42c4-9127-c0466e2212eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785544 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785573 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6lgh\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-kube-api-access-n6lgh\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785621 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785676 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2786789-8885-42c4-9127-c0466e2212eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785797 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785846 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785879 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.785902 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.787490 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.788791 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.792989 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.794844 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.795164 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.795264 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2786789-8885-42c4-9127-c0466e2212eb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.795355 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.796628 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.807854 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.811489 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6lgh\" (UniqueName: 
\"kubernetes.io/projected/f2786789-8885-42c4-9127-c0466e2212eb-kube-api-access-n6lgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.811759 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2786789-8885-42c4-9127-c0466e2212eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.829069 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2786789-8885-42c4-9127-c0466e2212eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.838055 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" event={"ID":"f35cdc3f-cbcf-46b5-8988-a077a4e65284","Type":"ContainerStarted","Data":"0d60f7b72d9e28a516e5a7196ff480147dba0c8f6e9f74acd9ef0755f1a29e29"} Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.849567 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2786789-8885-42c4-9127-c0466e2212eb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:15 crc kubenswrapper[4675]: I0320 16:20:15.855858 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.904589 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.905804 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.911693 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.911896 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.912800 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.916117 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qkpz6" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.917603 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 16:20:16 crc kubenswrapper[4675]: I0320 16:20:16.921207 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004496 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004543 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-operator-scripts\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004562 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/acbf924d-7363-4489-a64a-51c2949a2a69-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004736 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acbf924d-7363-4489-a64a-51c2949a2a69-config-data-generated\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004817 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vp6g\" (UniqueName: \"kubernetes.io/projected/acbf924d-7363-4489-a64a-51c2949a2a69-kube-api-access-6vp6g\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004939 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-kolla-config\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.004990 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-config-data-default\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.005021 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf924d-7363-4489-a64a-51c2949a2a69-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106299 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acbf924d-7363-4489-a64a-51c2949a2a69-config-data-generated\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106348 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vp6g\" (UniqueName: \"kubernetes.io/projected/acbf924d-7363-4489-a64a-51c2949a2a69-kube-api-access-6vp6g\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106378 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-kolla-config\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106398 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106419 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbf924d-7363-4489-a64a-51c2949a2a69-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106463 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106489 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-operator-scripts\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106513 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/acbf924d-7363-4489-a64a-51c2949a2a69-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.106963 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acbf924d-7363-4489-a64a-51c2949a2a69-config-data-generated\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 
16:20:17.107314 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-kolla-config\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.107333 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.107587 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-config-data-default\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.108991 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acbf924d-7363-4489-a64a-51c2949a2a69-operator-scripts\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.111035 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/acbf924d-7363-4489-a64a-51c2949a2a69-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.111628 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/acbf924d-7363-4489-a64a-51c2949a2a69-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.128925 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vp6g\" (UniqueName: \"kubernetes.io/projected/acbf924d-7363-4489-a64a-51c2949a2a69-kube-api-access-6vp6g\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.164929 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"acbf924d-7363-4489-a64a-51c2949a2a69\") " pod="openstack/openstack-galera-0" Mar 20 16:20:17 crc kubenswrapper[4675]: I0320 16:20:17.247214 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.388698 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.390140 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.393488 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.393665 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nktxc" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.394060 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.396124 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.401849 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552493 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkd9\" (UniqueName: \"kubernetes.io/projected/47fb8c80-d4bd-42fb-bcc3-752f854574b4-kube-api-access-mrkd9\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552564 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552595 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552626 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47fb8c80-d4bd-42fb-bcc3-752f854574b4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552655 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47fb8c80-d4bd-42fb-bcc3-752f854574b4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552690 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47fb8c80-d4bd-42fb-bcc3-752f854574b4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552716 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.552784 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: W0320 16:20:18.590288 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91e693b_6453_443b_8211_6a29b07c91bd.slice/crio-dfabfb4ed8716099179062a40bfdc8bbe9b9fd6a0320af48b958db24b47bf033 WatchSource:0}: Error finding container dfabfb4ed8716099179062a40bfdc8bbe9b9fd6a0320af48b958db24b47bf033: Status 404 returned error can't find the container with id dfabfb4ed8716099179062a40bfdc8bbe9b9fd6a0320af48b958db24b47bf033 Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.653822 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47fb8c80-d4bd-42fb-bcc3-752f854574b4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.653897 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47fb8c80-d4bd-42fb-bcc3-752f854574b4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654421 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47fb8c80-d4bd-42fb-bcc3-752f854574b4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654466 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47fb8c80-d4bd-42fb-bcc3-752f854574b4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654493 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654537 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654585 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkd9\" (UniqueName: \"kubernetes.io/projected/47fb8c80-d4bd-42fb-bcc3-752f854574b4-kube-api-access-mrkd9\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654608 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654628 4675 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.654822 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.655354 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.657347 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.657582 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47fb8c80-d4bd-42fb-bcc3-752f854574b4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.662286 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47fb8c80-d4bd-42fb-bcc3-752f854574b4-galera-tls-certs\") 
pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.662438 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47fb8c80-d4bd-42fb-bcc3-752f854574b4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.677939 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkd9\" (UniqueName: \"kubernetes.io/projected/47fb8c80-d4bd-42fb-bcc3-752f854574b4-kube-api-access-mrkd9\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.704924 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"47fb8c80-d4bd-42fb-bcc3-752f854574b4\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.718311 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.719467 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.725387 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kxn8c" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.725639 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.725802 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.728608 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.763048 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.862783 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v59dm\" (UniqueName: \"kubernetes.io/projected/7667dc1e-d72a-4119-8e30-a8267d0149f4-kube-api-access-v59dm\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.862834 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667dc1e-d72a-4119-8e30-a8267d0149f4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.862892 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7667dc1e-d72a-4119-8e30-a8267d0149f4-kolla-config\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") 
" pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.862927 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7667dc1e-d72a-4119-8e30-a8267d0149f4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.862966 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7667dc1e-d72a-4119-8e30-a8267d0149f4-config-data\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.877937 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" event={"ID":"b91e693b-6453-443b-8211-6a29b07c91bd","Type":"ContainerStarted","Data":"dfabfb4ed8716099179062a40bfdc8bbe9b9fd6a0320af48b958db24b47bf033"} Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.963860 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7667dc1e-d72a-4119-8e30-a8267d0149f4-kolla-config\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.964210 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7667dc1e-d72a-4119-8e30-a8267d0149f4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.964256 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7667dc1e-d72a-4119-8e30-a8267d0149f4-config-data\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.964341 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v59dm\" (UniqueName: \"kubernetes.io/projected/7667dc1e-d72a-4119-8e30-a8267d0149f4-kube-api-access-v59dm\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.964376 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667dc1e-d72a-4119-8e30-a8267d0149f4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.965421 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7667dc1e-d72a-4119-8e30-a8267d0149f4-config-data\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.968402 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667dc1e-d72a-4119-8e30-a8267d0149f4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.981216 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7667dc1e-d72a-4119-8e30-a8267d0149f4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 
16:20:18.984472 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7667dc1e-d72a-4119-8e30-a8267d0149f4-kolla-config\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:18 crc kubenswrapper[4675]: I0320 16:20:18.987864 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v59dm\" (UniqueName: \"kubernetes.io/projected/7667dc1e-d72a-4119-8e30-a8267d0149f4-kube-api-access-v59dm\") pod \"memcached-0\" (UID: \"7667dc1e-d72a-4119-8e30-a8267d0149f4\") " pod="openstack/memcached-0" Mar 20 16:20:19 crc kubenswrapper[4675]: I0320 16:20:19.076109 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.641828 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.644648 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.647396 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hmb7q" Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.649692 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.817930 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jv6m\" (UniqueName: \"kubernetes.io/projected/f8f8e55f-429c-43cf-9aea-9524bf3caac7-kube-api-access-6jv6m\") pod \"kube-state-metrics-0\" (UID: \"f8f8e55f-429c-43cf-9aea-9524bf3caac7\") " pod="openstack/kube-state-metrics-0" Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.919366 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jv6m\" (UniqueName: \"kubernetes.io/projected/f8f8e55f-429c-43cf-9aea-9524bf3caac7-kube-api-access-6jv6m\") pod \"kube-state-metrics-0\" (UID: \"f8f8e55f-429c-43cf-9aea-9524bf3caac7\") " pod="openstack/kube-state-metrics-0" Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.938752 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jv6m\" (UniqueName: \"kubernetes.io/projected/f8f8e55f-429c-43cf-9aea-9524bf3caac7-kube-api-access-6jv6m\") pod \"kube-state-metrics-0\" (UID: \"f8f8e55f-429c-43cf-9aea-9524bf3caac7\") " pod="openstack/kube-state-metrics-0" Mar 20 16:20:20 crc kubenswrapper[4675]: I0320 16:20:20.974189 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:20:21 crc kubenswrapper[4675]: I0320 16:20:21.819809 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.288634 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b5cf6"] Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.289830 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.292245 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.292360 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xdvd5" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.292729 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.312095 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b5cf6"] Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.382357 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-prjxt"] Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.383941 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.411113 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-prjxt"] Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472640 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l475\" (UniqueName: \"kubernetes.io/projected/b563a826-d7ed-453e-89f6-aec33699291e-kube-api-access-7l475\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472704 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-etc-ovs\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472734 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrf5s\" (UniqueName: \"kubernetes.io/projected/6caea9dd-db8f-4f21-b684-5899258ff290-kube-api-access-wrf5s\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472761 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-run\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472812 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b563a826-d7ed-453e-89f6-aec33699291e-ovn-controller-tls-certs\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472832 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-run\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472845 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-log\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472875 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-run-ovn\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472894 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b563a826-d7ed-453e-89f6-aec33699291e-scripts\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472915 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-log-ovn\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472932 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6caea9dd-db8f-4f21-b684-5899258ff290-scripts\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472963 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-lib\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.472978 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b563a826-d7ed-453e-89f6-aec33699291e-combined-ca-bundle\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574069 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l475\" (UniqueName: \"kubernetes.io/projected/b563a826-d7ed-453e-89f6-aec33699291e-kube-api-access-7l475\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574115 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-etc-ovs\") pod 
\"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574143 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrf5s\" (UniqueName: \"kubernetes.io/projected/6caea9dd-db8f-4f21-b684-5899258ff290-kube-api-access-wrf5s\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574168 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-run\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574200 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b563a826-d7ed-453e-89f6-aec33699291e-ovn-controller-tls-certs\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574224 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-run\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574242 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-log\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" 
Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574279 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-run-ovn\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574303 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b563a826-d7ed-453e-89f6-aec33699291e-scripts\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574326 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-log-ovn\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574347 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6caea9dd-db8f-4f21-b684-5899258ff290-scripts\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574390 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-lib\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.574412 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b563a826-d7ed-453e-89f6-aec33699291e-combined-ca-bundle\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.576545 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-log\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.576668 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-etc-ovs\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.576702 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-log-ovn\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.576923 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-lib\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.576943 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-run-ovn\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " 
pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.580034 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b563a826-d7ed-453e-89f6-aec33699291e-var-run\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.580021 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6caea9dd-db8f-4f21-b684-5899258ff290-var-run\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.581729 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b563a826-d7ed-453e-89f6-aec33699291e-combined-ca-bundle\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.584326 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6caea9dd-db8f-4f21-b684-5899258ff290-scripts\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.593947 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b563a826-d7ed-453e-89f6-aec33699291e-ovn-controller-tls-certs\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.597428 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b563a826-d7ed-453e-89f6-aec33699291e-scripts\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.598591 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l475\" (UniqueName: \"kubernetes.io/projected/b563a826-d7ed-453e-89f6-aec33699291e-kube-api-access-7l475\") pod \"ovn-controller-b5cf6\" (UID: \"b563a826-d7ed-453e-89f6-aec33699291e\") " pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.599614 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrf5s\" (UniqueName: \"kubernetes.io/projected/6caea9dd-db8f-4f21-b684-5899258ff290-kube-api-access-wrf5s\") pod \"ovn-controller-ovs-prjxt\" (UID: \"6caea9dd-db8f-4f21-b684-5899258ff290\") " pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.608199 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:24 crc kubenswrapper[4675]: I0320 16:20:24.704205 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.138334 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.140310 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.145883 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.148138 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.148144 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8l52g" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.148194 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.148915 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.150039 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282321 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282363 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282416 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67c80b11-0763-4407-ad6b-5f1fef8ad591-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282441 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sq4q\" (UniqueName: \"kubernetes.io/projected/67c80b11-0763-4407-ad6b-5f1fef8ad591-kube-api-access-6sq4q\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282462 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67c80b11-0763-4407-ad6b-5f1fef8ad591-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282479 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282504 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.282524 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/67c80b11-0763-4407-ad6b-5f1fef8ad591-config\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384219 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67c80b11-0763-4407-ad6b-5f1fef8ad591-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384278 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sq4q\" (UniqueName: \"kubernetes.io/projected/67c80b11-0763-4407-ad6b-5f1fef8ad591-kube-api-access-6sq4q\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384311 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67c80b11-0763-4407-ad6b-5f1fef8ad591-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384336 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384370 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384402 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c80b11-0763-4407-ad6b-5f1fef8ad591-config\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384493 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384513 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.384760 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67c80b11-0763-4407-ad6b-5f1fef8ad591-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.386198 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c80b11-0763-4407-ad6b-5f1fef8ad591-config\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.386396 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/67c80b11-0763-4407-ad6b-5f1fef8ad591-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.386519 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.389885 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.390343 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.403818 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c80b11-0763-4407-ad6b-5f1fef8ad591-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.406179 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sq4q\" (UniqueName: \"kubernetes.io/projected/67c80b11-0763-4407-ad6b-5f1fef8ad591-kube-api-access-6sq4q\") pod \"ovsdbserver-sb-0\" (UID: 
\"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.411956 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67c80b11-0763-4407-ad6b-5f1fef8ad591\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:25 crc kubenswrapper[4675]: I0320 16:20:25.463148 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:26 crc kubenswrapper[4675]: I0320 16:20:26.046095 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:20:26 crc kubenswrapper[4675]: I0320 16:20:26.465784 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 16:20:26 crc kubenswrapper[4675]: W0320 16:20:26.875903 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacbf924d_7363_4489_a64a_51c2949a2a69.slice/crio-e5112698ffcd15661e9c2fc33e58af270f42f04a3d76e6a65a958c3a34d9895b WatchSource:0}: Error finding container e5112698ffcd15661e9c2fc33e58af270f42f04a3d76e6a65a958c3a34d9895b: Status 404 returned error can't find the container with id e5112698ffcd15661e9c2fc33e58af270f42f04a3d76e6a65a958c3a34d9895b Mar 20 16:20:26 crc kubenswrapper[4675]: E0320 16:20:26.897692 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 16:20:26 crc kubenswrapper[4675]: E0320 16:20:26.897876 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq 
--interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7w4m8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-78dd6ddcc-5lrzg_openstack(b590f984-fe7e-4e88-a997-12ea0653abc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:20:26 crc kubenswrapper[4675]: E0320 16:20:26.899205 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" podUID="b590f984-fe7e-4e88-a997-12ea0653abc2" Mar 20 16:20:26 crc kubenswrapper[4675]: E0320 16:20:26.913508 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 16:20:26 crc kubenswrapper[4675]: E0320 16:20:26.913682 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b2knp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-m74dd_openstack(e0f3d2f9-afe2-4551-8bac-25b7a220ecca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:20:26 crc kubenswrapper[4675]: E0320 16:20:26.914902 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" podUID="e0f3d2f9-afe2-4551-8bac-25b7a220ecca" Mar 20 16:20:26 crc kubenswrapper[4675]: I0320 16:20:26.954349 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87f2f4be-70c8-409a-8fe8-c753758021f4","Type":"ContainerStarted","Data":"99e8106d1b22d92d5c69be050de536b3c1f7bf3320e4780fa180c567f7c3d699"} Mar 20 16:20:26 crc kubenswrapper[4675]: I0320 16:20:26.956129 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"acbf924d-7363-4489-a64a-51c2949a2a69","Type":"ContainerStarted","Data":"e5112698ffcd15661e9c2fc33e58af270f42f04a3d76e6a65a958c3a34d9895b"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.306264 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.307942 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.309640 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zfwqp" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.310298 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.310819 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.311467 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.323064 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.397587 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-b5cf6"] Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.413151 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.419503 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 16:20:27 crc kubenswrapper[4675]: W0320 16:20:27.445479 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f8e55f_429c_43cf_9aea_9524bf3caac7.slice/crio-8472170ec9b8397231a9946daca64413562223bcbe0568878063b914a887da36 WatchSource:0}: Error finding container 8472170ec9b8397231a9946daca64413562223bcbe0568878063b914a887da36: Status 404 returned error can't find the container with id 8472170ec9b8397231a9946daca64413562223bcbe0568878063b914a887da36 Mar 20 16:20:27 crc kubenswrapper[4675]: W0320 16:20:27.448091 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2786789_8885_42c4_9127_c0466e2212eb.slice/crio-0b3e3c413ba45038b357fc17a700cedc6d51532cc30583ab8cd963262d55927c WatchSource:0}: Error finding container 0b3e3c413ba45038b357fc17a700cedc6d51532cc30583ab8cd963262d55927c: Status 404 returned error can't find the container with id 0b3e3c413ba45038b357fc17a700cedc6d51532cc30583ab8cd963262d55927c Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454503 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454549 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g5vfv\" (UniqueName: \"kubernetes.io/projected/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-kube-api-access-g5vfv\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454579 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454752 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454855 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-config\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454918 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.454938 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.455123 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.554983 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.556469 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.556514 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5vfv\" (UniqueName: \"kubernetes.io/projected/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-kube-api-access-g5vfv\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.556584 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.556905 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.557094 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-config\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.557783 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.557842 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.558310 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-config\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.557861 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 
16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.558503 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.558621 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.559626 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.571174 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.575112 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.576461 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.576966 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.582214 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5vfv\" (UniqueName: \"kubernetes.io/projected/1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc-kube-api-access-g5vfv\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.605961 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.625555 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.636154 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 16:20:27 crc kubenswrapper[4675]: W0320 16:20:27.656363 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47fb8c80_d4bd_42fb_bcc3_752f854574b4.slice/crio-8759f85eca26e2f2e9e35db7db6c66cd1827429f9aff727c55f4d809e1d6850b WatchSource:0}: Error finding container 8759f85eca26e2f2e9e35db7db6c66cd1827429f9aff727c55f4d809e1d6850b: Status 404 returned error can't find the container with id 8759f85eca26e2f2e9e35db7db6c66cd1827429f9aff727c55f4d809e1d6850b Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.659535 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2knp\" (UniqueName: \"kubernetes.io/projected/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-kube-api-access-b2knp\") pod \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.659698 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-config\") pod \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\" (UID: \"e0f3d2f9-afe2-4551-8bac-25b7a220ecca\") " Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.661460 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-config" (OuterVolumeSpecName: "config") pod "e0f3d2f9-afe2-4551-8bac-25b7a220ecca" (UID: "e0f3d2f9-afe2-4551-8bac-25b7a220ecca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.664432 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-kube-api-access-b2knp" (OuterVolumeSpecName: "kube-api-access-b2knp") pod "e0f3d2f9-afe2-4551-8bac-25b7a220ecca" (UID: "e0f3d2f9-afe2-4551-8bac-25b7a220ecca"). InnerVolumeSpecName "kube-api-access-b2knp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.665342 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.761904 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-config\") pod \"b590f984-fe7e-4e88-a997-12ea0653abc2\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.761979 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w4m8\" (UniqueName: \"kubernetes.io/projected/b590f984-fe7e-4e88-a997-12ea0653abc2-kube-api-access-7w4m8\") pod \"b590f984-fe7e-4e88-a997-12ea0653abc2\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.762071 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-dns-svc\") pod \"b590f984-fe7e-4e88-a997-12ea0653abc2\" (UID: \"b590f984-fe7e-4e88-a997-12ea0653abc2\") " Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.763306 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-config" (OuterVolumeSpecName: "config") pod "b590f984-fe7e-4e88-a997-12ea0653abc2" 
(UID: "b590f984-fe7e-4e88-a997-12ea0653abc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.763888 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b590f984-fe7e-4e88-a997-12ea0653abc2" (UID: "b590f984-fe7e-4e88-a997-12ea0653abc2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.764285 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.764309 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.764321 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b590f984-fe7e-4e88-a997-12ea0653abc2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.764335 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2knp\" (UniqueName: \"kubernetes.io/projected/e0f3d2f9-afe2-4551-8bac-25b7a220ecca-kube-api-access-b2knp\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.766330 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b590f984-fe7e-4e88-a997-12ea0653abc2-kube-api-access-7w4m8" (OuterVolumeSpecName: "kube-api-access-7w4m8") pod "b590f984-fe7e-4e88-a997-12ea0653abc2" (UID: "b590f984-fe7e-4e88-a997-12ea0653abc2"). 
InnerVolumeSpecName "kube-api-access-7w4m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.791372 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-prjxt"] Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.873033 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w4m8\" (UniqueName: \"kubernetes.io/projected/b590f984-fe7e-4e88-a997-12ea0653abc2-kube-api-access-7w4m8\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.967211 4675 generic.go:334] "Generic (PLEG): container finished" podID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerID="c3ada32ad62a8b28b2f7af440997ea46a274ccbf053ae427f5b46f61a43ebe45" exitCode=0 Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.967335 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" event={"ID":"f35cdc3f-cbcf-46b5-8988-a077a4e65284","Type":"ContainerDied","Data":"c3ada32ad62a8b28b2f7af440997ea46a274ccbf053ae427f5b46f61a43ebe45"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.969979 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-prjxt" event={"ID":"6caea9dd-db8f-4f21-b684-5899258ff290","Type":"ContainerStarted","Data":"dcafb4bf863e4908c30ba84e2895e4254e88088dd9b14c92a965d9f0c3b787cc"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.971106 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7667dc1e-d72a-4119-8e30-a8267d0149f4","Type":"ContainerStarted","Data":"51e6780338f1f98c184128624902f1d1802ac45edca4d8d3249a2d36e96623e7"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.972667 4675 generic.go:334] "Generic (PLEG): container finished" podID="b91e693b-6453-443b-8211-6a29b07c91bd" containerID="b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85" exitCode=0 Mar 20 16:20:27 crc 
kubenswrapper[4675]: I0320 16:20:27.972722 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" event={"ID":"b91e693b-6453-443b-8211-6a29b07c91bd","Type":"ContainerDied","Data":"b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.976456 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8f8e55f-429c-43cf-9aea-9524bf3caac7","Type":"ContainerStarted","Data":"8472170ec9b8397231a9946daca64413562223bcbe0568878063b914a887da36"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.987252 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2786789-8885-42c4-9127-c0466e2212eb","Type":"ContainerStarted","Data":"0b3e3c413ba45038b357fc17a700cedc6d51532cc30583ab8cd963262d55927c"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.988692 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" event={"ID":"e0f3d2f9-afe2-4551-8bac-25b7a220ecca","Type":"ContainerDied","Data":"08811964d78f6273bea97110fb5659ad367cd04043fb7cdb1b2944e0e038f580"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.988785 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m74dd" Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.990367 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"47fb8c80-d4bd-42fb-bcc3-752f854574b4","Type":"ContainerStarted","Data":"8759f85eca26e2f2e9e35db7db6c66cd1827429f9aff727c55f4d809e1d6850b"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.992500 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6" event={"ID":"b563a826-d7ed-453e-89f6-aec33699291e","Type":"ContainerStarted","Data":"04b47e5c37819514659023242cdb57604b2d9a5abde814de69c6028a6d50fbd9"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.994888 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" event={"ID":"b590f984-fe7e-4e88-a997-12ea0653abc2","Type":"ContainerDied","Data":"8288a460322f703c4ccf04bc289e4ca8645e9bdc6b2b5cd0994deedd8c991d98"} Mar 20 16:20:27 crc kubenswrapper[4675]: I0320 16:20:27.994956 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5lrzg" Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.053417 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m74dd"] Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.061837 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m74dd"] Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.080156 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5lrzg"] Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.087541 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5lrzg"] Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.168202 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 16:20:28 crc kubenswrapper[4675]: W0320 16:20:28.172566 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e0a38ce_32e7_4af9_a6d3_a2ed52e644cc.slice/crio-e610d93cc5a6ed59507519d13ae0cdadf7e259191e7a32507d7a1d91edcb5640 WatchSource:0}: Error finding container e610d93cc5a6ed59507519d13ae0cdadf7e259191e7a32507d7a1d91edcb5640: Status 404 returned error can't find the container with id e610d93cc5a6ed59507519d13ae0cdadf7e259191e7a32507d7a1d91edcb5640 Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.632575 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.692874 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b590f984-fe7e-4e88-a997-12ea0653abc2" path="/var/lib/kubelet/pods/b590f984-fe7e-4e88-a997-12ea0653abc2/volumes" Mar 20 16:20:28 crc kubenswrapper[4675]: I0320 16:20:28.693472 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f3d2f9-afe2-4551-8bac-25b7a220ecca" 
path="/var/lib/kubelet/pods/e0f3d2f9-afe2-4551-8bac-25b7a220ecca/volumes" Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.007819 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc","Type":"ContainerStarted","Data":"e610d93cc5a6ed59507519d13ae0cdadf7e259191e7a32507d7a1d91edcb5640"} Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.011336 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" event={"ID":"b91e693b-6453-443b-8211-6a29b07c91bd","Type":"ContainerStarted","Data":"23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4"} Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.011900 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.014234 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67c80b11-0763-4407-ad6b-5f1fef8ad591","Type":"ContainerStarted","Data":"4e495fa062f0342f972a983ad8d1e1a7b1d8bc28de2772d15b224c0403665533"} Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.017394 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" event={"ID":"f35cdc3f-cbcf-46b5-8988-a077a4e65284","Type":"ContainerStarted","Data":"191c55d7a1d3694651818e4275e489b4dd35198059da5695a7af2b5eaeb0fff5"} Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.018567 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.032797 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" podStartSLOduration=6.625414137 podStartE2EDuration="15.032781412s" podCreationTimestamp="2026-03-20 16:20:14 +0000 UTC" firstStartedPulling="2026-03-20 
16:20:18.594599705 +0000 UTC m=+1138.628229242" lastFinishedPulling="2026-03-20 16:20:27.00196698 +0000 UTC m=+1147.035596517" observedRunningTime="2026-03-20 16:20:29.024358943 +0000 UTC m=+1149.057988480" watchObservedRunningTime="2026-03-20 16:20:29.032781412 +0000 UTC m=+1149.066410939" Mar 20 16:20:29 crc kubenswrapper[4675]: I0320 16:20:29.051544 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" podStartSLOduration=3.204165146 podStartE2EDuration="15.051529326s" podCreationTimestamp="2026-03-20 16:20:14 +0000 UTC" firstStartedPulling="2026-03-20 16:20:15.146919512 +0000 UTC m=+1135.180549049" lastFinishedPulling="2026-03-20 16:20:26.994283692 +0000 UTC m=+1147.027913229" observedRunningTime="2026-03-20 16:20:29.048378806 +0000 UTC m=+1149.082008343" watchObservedRunningTime="2026-03-20 16:20:29.051529326 +0000 UTC m=+1149.085158853" Mar 20 16:20:34 crc kubenswrapper[4675]: I0320 16:20:34.425196 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:20:34 crc kubenswrapper[4675]: I0320 16:20:34.425707 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:20:34 crc kubenswrapper[4675]: I0320 16:20:34.724923 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:34 crc kubenswrapper[4675]: I0320 16:20:34.940073 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:34 crc kubenswrapper[4675]: I0320 16:20:34.996542 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-t7w6q"] Mar 20 16:20:35 crc kubenswrapper[4675]: I0320 16:20:35.063904 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="dnsmasq-dns" containerID="cri-o://191c55d7a1d3694651818e4275e489b4dd35198059da5695a7af2b5eaeb0fff5" gracePeriod=10 Mar 20 16:20:36 crc kubenswrapper[4675]: I0320 16:20:36.072470 4675 generic.go:334] "Generic (PLEG): container finished" podID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerID="191c55d7a1d3694651818e4275e489b4dd35198059da5695a7af2b5eaeb0fff5" exitCode=0 Mar 20 16:20:36 crc kubenswrapper[4675]: I0320 16:20:36.072784 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" event={"ID":"f35cdc3f-cbcf-46b5-8988-a077a4e65284","Type":"ContainerDied","Data":"191c55d7a1d3694651818e4275e489b4dd35198059da5695a7af2b5eaeb0fff5"} Mar 20 16:20:36 crc kubenswrapper[4675]: I0320 16:20:36.687353 4675 scope.go:117] "RemoveContainer" containerID="46da6bc5cd6b0d528789f8622586a747fc5ba1123258d1379dc1ae6321c4a71f" Mar 20 16:20:39 crc kubenswrapper[4675]: I0320 16:20:39.602903 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Mar 20 16:20:44 crc kubenswrapper[4675]: I0320 16:20:44.602969 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Mar 20 16:20:47 crc 
kubenswrapper[4675]: E0320 16:20:47.528708 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Mar 20 16:20:47 crc kubenswrapper[4675]: E0320 16:20:47.529310 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57bh675h78h5b4hb4h676hc6h6dh5c7hd7h89h8h576hdbh56dhc7h85h55bh559h587hdfh5ddh66fh665h68dh7fh65ch98h67dh65chc9h5b4q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserv
er-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6sq4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(67c80b11-0763-4407-ad6b-5f1fef8ad591): ErrImagePull: rpc error: code 
= Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.531120 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-np69v"] Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.532101 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.535354 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.552232 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-np69v"] Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.676163 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks2w8"] Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.677643 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.679712 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.692078 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks2w8"] Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.727152 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/694ef288-9c84-4800-8f0c-aa30aa0c74a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.727232 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/694ef288-9c84-4800-8f0c-aa30aa0c74a0-ovn-rundir\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.727290 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jlv\" (UniqueName: \"kubernetes.io/projected/694ef288-9c84-4800-8f0c-aa30aa0c74a0-kube-api-access-t6jlv\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.727332 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694ef288-9c84-4800-8f0c-aa30aa0c74a0-config\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " 
pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.727400 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694ef288-9c84-4800-8f0c-aa30aa0c74a0-combined-ca-bundle\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.727424 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/694ef288-9c84-4800-8f0c-aa30aa0c74a0-ovs-rundir\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.829143 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694ef288-9c84-4800-8f0c-aa30aa0c74a0-combined-ca-bundle\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.829197 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.829217 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/694ef288-9c84-4800-8f0c-aa30aa0c74a0-ovs-rundir\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " 
pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.829259 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-config\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830003 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830126 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/694ef288-9c84-4800-8f0c-aa30aa0c74a0-ovs-rundir\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830186 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/694ef288-9c84-4800-8f0c-aa30aa0c74a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830313 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/694ef288-9c84-4800-8f0c-aa30aa0c74a0-ovn-rundir\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " 
pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830390 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7dg\" (UniqueName: \"kubernetes.io/projected/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-kube-api-access-wd7dg\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830457 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/694ef288-9c84-4800-8f0c-aa30aa0c74a0-ovn-rundir\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830534 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jlv\" (UniqueName: \"kubernetes.io/projected/694ef288-9c84-4800-8f0c-aa30aa0c74a0-kube-api-access-t6jlv\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.830619 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694ef288-9c84-4800-8f0c-aa30aa0c74a0-config\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.832125 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694ef288-9c84-4800-8f0c-aa30aa0c74a0-config\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 
crc kubenswrapper[4675]: I0320 16:20:47.835807 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/694ef288-9c84-4800-8f0c-aa30aa0c74a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.836219 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694ef288-9c84-4800-8f0c-aa30aa0c74a0-combined-ca-bundle\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.853104 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jlv\" (UniqueName: \"kubernetes.io/projected/694ef288-9c84-4800-8f0c-aa30aa0c74a0-kube-api-access-t6jlv\") pod \"ovn-controller-metrics-np69v\" (UID: \"694ef288-9c84-4800-8f0c-aa30aa0c74a0\") " pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.900783 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-np69v" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.931745 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.931798 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-config\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.931830 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.931881 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7dg\" (UniqueName: \"kubernetes.io/projected/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-kube-api-access-wd7dg\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.932499 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc 
kubenswrapper[4675]: I0320 16:20:47.932726 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-config\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.933657 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.952544 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7dg\" (UniqueName: \"kubernetes.io/projected/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-kube-api-access-wd7dg\") pod \"dnsmasq-dns-7fd796d7df-ks2w8\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.972681 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks2w8"] Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.973321 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.996224 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l56nr"] Mar 20 16:20:47 crc kubenswrapper[4675]: I0320 16:20:47.999646 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.003131 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.017471 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l56nr"] Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.139392 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.139477 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-config\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.139507 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.139536 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" 
Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.139580 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgkbr\" (UniqueName: \"kubernetes.io/projected/d1a33921-a0a7-417f-bcde-05c6af9baa10-kube-api-access-zgkbr\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.241715 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.241791 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.241837 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgkbr\" (UniqueName: \"kubernetes.io/projected/d1a33921-a0a7-417f-bcde-05c6af9baa10-kube-api-access-zgkbr\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.241942 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc 
kubenswrapper[4675]: I0320 16:20:48.241989 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-config\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.243463 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-config\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.243587 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.244152 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.244336 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.259663 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zgkbr\" (UniqueName: \"kubernetes.io/projected/d1a33921-a0a7-417f-bcde-05c6af9baa10-kube-api-access-zgkbr\") pod \"dnsmasq-dns-86db49b7ff-l56nr\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:48 crc kubenswrapper[4675]: I0320 16:20:48.321223 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:49 crc kubenswrapper[4675]: E0320 16:20:49.371916 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 16:20:49 crc kubenswrapper[4675]: E0320 16:20:49.372244 4675 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:20:49 crc kubenswrapper[4675]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 16:20:49 crc kubenswrapper[4675]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 16:20:49 crc kubenswrapper[4675]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 16:20:49 crc kubenswrapper[4675]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 16:20:49 crc kubenswrapper[4675]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 16:20:49 crc kubenswrapper[4675]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 16:20:49 crc kubenswrapper[4675]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 16:20:49 crc kubenswrapper[4675]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 16:20:49 crc kubenswrapper[4675]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 
20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dc25j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(87f2f4be-70c8-409a-8fe8-c753758021f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 16:20:49 crc 
kubenswrapper[4675]: > logger="UnhandledError" Mar 20 16:20:49 crc kubenswrapper[4675]: E0320 16:20:49.373416 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="87f2f4be-70c8-409a-8fe8-c753758021f4" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.385500 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:49 crc kubenswrapper[4675]: E0320 16:20:49.467669 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 16:20:49 crc kubenswrapper[4675]: E0320 16:20:49.467829 4675 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:20:49 crc kubenswrapper[4675]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 16:20:49 crc kubenswrapper[4675]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 16:20:49 crc kubenswrapper[4675]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 16:20:49 crc kubenswrapper[4675]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 16:20:49 crc kubenswrapper[4675]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 16:20:49 crc kubenswrapper[4675]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 16:20:49 crc kubenswrapper[4675]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 16:20:49 crc kubenswrapper[4675]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 16:20:49 crc kubenswrapper[4675]: sleep 
30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n6lgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Res
tartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(f2786789-8885-42c4-9127-c0466e2212eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 16:20:49 crc kubenswrapper[4675]: > logger="UnhandledError" Mar 20 16:20:49 crc kubenswrapper[4675]: E0320 16:20:49.468919 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f2786789-8885-42c4-9127-c0466e2212eb" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.570152 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sl5z\" (UniqueName: \"kubernetes.io/projected/f35cdc3f-cbcf-46b5-8988-a077a4e65284-kube-api-access-2sl5z\") pod \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.570197 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-dns-svc\") pod \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.570365 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-config\") pod \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\" (UID: \"f35cdc3f-cbcf-46b5-8988-a077a4e65284\") " Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.574268 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35cdc3f-cbcf-46b5-8988-a077a4e65284-kube-api-access-2sl5z" (OuterVolumeSpecName: "kube-api-access-2sl5z") pod "f35cdc3f-cbcf-46b5-8988-a077a4e65284" (UID: 
"f35cdc3f-cbcf-46b5-8988-a077a4e65284"). InnerVolumeSpecName "kube-api-access-2sl5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.607423 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f35cdc3f-cbcf-46b5-8988-a077a4e65284" (UID: "f35cdc3f-cbcf-46b5-8988-a077a4e65284"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.616629 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-config" (OuterVolumeSpecName: "config") pod "f35cdc3f-cbcf-46b5-8988-a077a4e65284" (UID: "f35cdc3f-cbcf-46b5-8988-a077a4e65284"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.672140 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sl5z\" (UniqueName: \"kubernetes.io/projected/f35cdc3f-cbcf-46b5-8988-a077a4e65284-kube-api-access-2sl5z\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.672172 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:49 crc kubenswrapper[4675]: I0320 16:20:49.672181 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35cdc3f-cbcf-46b5-8988-a077a4e65284-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.136154 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-np69v"] Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.184255 
4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" event={"ID":"f35cdc3f-cbcf-46b5-8988-a077a4e65284","Type":"ContainerDied","Data":"0d60f7b72d9e28a516e5a7196ff480147dba0c8f6e9f74acd9ef0755f1a29e29"} Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.184279 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-t7w6q" Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.184311 4675 scope.go:117] "RemoveContainer" containerID="191c55d7a1d3694651818e4275e489b4dd35198059da5695a7af2b5eaeb0fff5" Mar 20 16:20:50 crc kubenswrapper[4675]: E0320 16:20:50.185815 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="87f2f4be-70c8-409a-8fe8-c753758021f4" Mar 20 16:20:50 crc kubenswrapper[4675]: E0320 16:20:50.185816 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f2786789-8885-42c4-9127-c0466e2212eb" Mar 20 16:20:50 crc kubenswrapper[4675]: W0320 16:20:50.225473 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod694ef288_9c84_4800_8f0c_aa30aa0c74a0.slice/crio-4abc72d99cf31cd1f98aefd667aab9001a7f52dfbcc48b146beb713a269de6c3 WatchSource:0}: Error finding container 4abc72d99cf31cd1f98aefd667aab9001a7f52dfbcc48b146beb713a269de6c3: Status 404 returned error can't find the container with id 4abc72d99cf31cd1f98aefd667aab9001a7f52dfbcc48b146beb713a269de6c3 Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.265487 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-666b6646f7-t7w6q"] Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.271177 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-t7w6q"] Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.277984 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks2w8"] Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.319096 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l56nr"] Mar 20 16:20:50 crc kubenswrapper[4675]: W0320 16:20:50.357309 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a33921_a0a7_417f_bcde_05c6af9baa10.slice/crio-7e63d0f0017cd72d4c9121d4bf36436609252f43b64547e513b475937b74316d WatchSource:0}: Error finding container 7e63d0f0017cd72d4c9121d4bf36436609252f43b64547e513b475937b74316d: Status 404 returned error can't find the container with id 7e63d0f0017cd72d4c9121d4bf36436609252f43b64547e513b475937b74316d Mar 20 16:20:50 crc kubenswrapper[4675]: W0320 16:20:50.361005 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e98f660_63bc_4b67_ba1d_1bb9da72aa6f.slice/crio-d1a73dcf407c89894f552e97afc80cfb527e66944528b4bd8eab937445acbd56 WatchSource:0}: Error finding container d1a73dcf407c89894f552e97afc80cfb527e66944528b4bd8eab937445acbd56: Status 404 returned error can't find the container with id d1a73dcf407c89894f552e97afc80cfb527e66944528b4bd8eab937445acbd56 Mar 20 16:20:50 crc kubenswrapper[4675]: I0320 16:20:50.685968 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" path="/var/lib/kubelet/pods/f35cdc3f-cbcf-46b5-8988-a077a4e65284/volumes" Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.041690 4675 scope.go:117] "RemoveContainer" 
containerID="c3ada32ad62a8b28b2f7af440997ea46a274ccbf053ae427f5b46f61a43ebe45" Mar 20 16:20:51 crc kubenswrapper[4675]: E0320 16:20:51.073567 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 16:20:51 crc kubenswrapper[4675]: E0320 16:20:51.073627 4675 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 16:20:51 crc kubenswrapper[4675]: E0320 16:20:51.073804 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6jv6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(f8f8e55f-429c-43cf-9aea-9524bf3caac7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:20:51 crc kubenswrapper[4675]: E0320 16:20:51.074986 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.196075 4675 generic.go:334] "Generic (PLEG): container finished" podID="6caea9dd-db8f-4f21-b684-5899258ff290" containerID="86174b4bc7dbb092561bfb8960f35361a008fe77bb9ec9283263a8f45a5673a6" exitCode=0 Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.196162 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-prjxt" 
event={"ID":"6caea9dd-db8f-4f21-b684-5899258ff290","Type":"ContainerDied","Data":"86174b4bc7dbb092561bfb8960f35361a008fe77bb9ec9283263a8f45a5673a6"} Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.198036 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" event={"ID":"d1a33921-a0a7-417f-bcde-05c6af9baa10","Type":"ContainerStarted","Data":"7e63d0f0017cd72d4c9121d4bf36436609252f43b64547e513b475937b74316d"} Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.200015 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" event={"ID":"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f","Type":"ContainerStarted","Data":"d1a73dcf407c89894f552e97afc80cfb527e66944528b4bd8eab937445acbd56"} Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.203827 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7667dc1e-d72a-4119-8e30-a8267d0149f4","Type":"ContainerStarted","Data":"d7bec29501dceeee462eaf119a8e40087edcff54fb1c9a3719d07c85e5058a52"} Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.204421 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.205904 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-np69v" event={"ID":"694ef288-9c84-4800-8f0c-aa30aa0c74a0","Type":"ContainerStarted","Data":"4abc72d99cf31cd1f98aefd667aab9001a7f52dfbcc48b146beb713a269de6c3"} Mar 20 16:20:51 crc kubenswrapper[4675]: E0320 16:20:51.210649 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" Mar 20 16:20:51 crc kubenswrapper[4675]: I0320 16:20:51.294265 
4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.505929244 podStartE2EDuration="33.294246467s" podCreationTimestamp="2026-03-20 16:20:18 +0000 UTC" firstStartedPulling="2026-03-20 16:20:27.678212225 +0000 UTC m=+1147.711841762" lastFinishedPulling="2026-03-20 16:20:37.466529448 +0000 UTC m=+1157.500158985" observedRunningTime="2026-03-20 16:20:51.291210241 +0000 UTC m=+1171.324839788" watchObservedRunningTime="2026-03-20 16:20:51.294246467 +0000 UTC m=+1171.327876004" Mar 20 16:20:52 crc kubenswrapper[4675]: I0320 16:20:52.225621 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"acbf924d-7363-4489-a64a-51c2949a2a69","Type":"ContainerStarted","Data":"d9278c8db3d89c2706a94c22b86ff6fe071b563a13a884a2d152ca1fd7150b19"} Mar 20 16:20:53 crc kubenswrapper[4675]: E0320 16:20:53.214635 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="67c80b11-0763-4407-ad6b-5f1fef8ad591" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.236600 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67c80b11-0763-4407-ad6b-5f1fef8ad591","Type":"ContainerStarted","Data":"ae19d9ff56c7d4ae349b3b503dc5ad2b5511fa5f1f6d08ce6dd67ff9eb9f48b2"} Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.238474 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"47fb8c80-d4bd-42fb-bcc3-752f854574b4","Type":"ContainerStarted","Data":"d84c888406f34fc2ad7bc676a94257e95cf190b9b2c646e28a7f248dfd2891fb"} Mar 20 16:20:53 crc kubenswrapper[4675]: E0320 16:20:53.239205 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="67c80b11-0763-4407-ad6b-5f1fef8ad591" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.243478 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6" event={"ID":"b563a826-d7ed-453e-89f6-aec33699291e","Type":"ContainerStarted","Data":"a18eb7e7a2784570260a82e7526f0af2c1a8dc03af1a55fc76702e3bf27426e0"} Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.244070 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-b5cf6" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.245800 4675 generic.go:334] "Generic (PLEG): container finished" podID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerID="64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa" exitCode=0 Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.245993 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" event={"ID":"d1a33921-a0a7-417f-bcde-05c6af9baa10","Type":"ContainerDied","Data":"64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa"} Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.247642 4675 generic.go:334] "Generic (PLEG): container finished" podID="9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" containerID="4d4e566533bc20e685050bfb49c319a3798a2597ee04d2649ca7f2297e0fe08e" exitCode=0 Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.247855 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" event={"ID":"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f","Type":"ContainerDied","Data":"4d4e566533bc20e685050bfb49c319a3798a2597ee04d2649ca7f2297e0fe08e"} Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.250702 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc","Type":"ContainerStarted","Data":"d5360e665b62732ad76f04f6199b449718d5c938e3a3a438aabd7301b543f4b5"} Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.364209 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-b5cf6" podStartSLOduration=7.263655424 podStartE2EDuration="29.364186651s" podCreationTimestamp="2026-03-20 16:20:24 +0000 UTC" firstStartedPulling="2026-03-20 16:20:27.436551594 +0000 UTC m=+1147.470181131" lastFinishedPulling="2026-03-20 16:20:49.537082821 +0000 UTC m=+1169.570712358" observedRunningTime="2026-03-20 16:20:53.355157806 +0000 UTC m=+1173.388787343" watchObservedRunningTime="2026-03-20 16:20:53.364186651 +0000 UTC m=+1173.397816188" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.545322 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.654494 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-dns-svc\") pod \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.654908 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-config\") pod \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.654955 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-ovsdbserver-nb\") pod \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " Mar 20 
16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.655090 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd7dg\" (UniqueName: \"kubernetes.io/projected/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-kube-api-access-wd7dg\") pod \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\" (UID: \"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f\") " Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.661832 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-kube-api-access-wd7dg" (OuterVolumeSpecName: "kube-api-access-wd7dg") pod "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" (UID: "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f"). InnerVolumeSpecName "kube-api-access-wd7dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.710266 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" (UID: "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.710262 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" (UID: "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.711308 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-config" (OuterVolumeSpecName: "config") pod "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" (UID: "9e98f660-63bc-4b67-ba1d-1bb9da72aa6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.757757 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd7dg\" (UniqueName: \"kubernetes.io/projected/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-kube-api-access-wd7dg\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.757813 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.757825 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:53 crc kubenswrapper[4675]: I0320 16:20:53.757836 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.265853 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-prjxt" event={"ID":"6caea9dd-db8f-4f21-b684-5899258ff290","Type":"ContainerStarted","Data":"b7e4732c1b97f805b803129b119372966cb354da3193589cf7aba493ef1c5d16"} Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.266262 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.266320 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.266340 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-prjxt" event={"ID":"6caea9dd-db8f-4f21-b684-5899258ff290","Type":"ContainerStarted","Data":"8ba33cc39750b40c0a499eb2bfcfb199a767215977eb5f609be7633ccf27f7c6"} Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.271794 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" event={"ID":"d1a33921-a0a7-417f-bcde-05c6af9baa10","Type":"ContainerStarted","Data":"33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626"} Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.271988 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.273490 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.273449 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks2w8" event={"ID":"9e98f660-63bc-4b67-ba1d-1bb9da72aa6f","Type":"ContainerDied","Data":"d1a73dcf407c89894f552e97afc80cfb527e66944528b4bd8eab937445acbd56"} Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.273778 4675 scope.go:117] "RemoveContainer" containerID="4d4e566533bc20e685050bfb49c319a3798a2597ee04d2649ca7f2297e0fe08e" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.275282 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc","Type":"ContainerStarted","Data":"9be3961b0ae6e0d53eb2686e64cd4ce65cbf1210e879fcecf9377739e70a934c"} Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.278349 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-np69v" event={"ID":"694ef288-9c84-4800-8f0c-aa30aa0c74a0","Type":"ContainerStarted","Data":"d1b80a7bcd1c751e5ac68ce98a1b4e77ca57acb174bad16cd5f166b93ec20c9f"} Mar 20 16:20:54 crc kubenswrapper[4675]: E0320 16:20:54.282031 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="67c80b11-0763-4407-ad6b-5f1fef8ad591" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.310696 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-prjxt" podStartSLOduration=8.659629494 podStartE2EDuration="30.310670643s" podCreationTimestamp="2026-03-20 16:20:24 +0000 UTC" firstStartedPulling="2026-03-20 16:20:27.784740528 +0000 UTC m=+1147.818370065" lastFinishedPulling="2026-03-20 16:20:49.435781667 
+0000 UTC m=+1169.469411214" observedRunningTime="2026-03-20 16:20:54.298421386 +0000 UTC m=+1174.332050923" watchObservedRunningTime="2026-03-20 16:20:54.310670643 +0000 UTC m=+1174.344300180" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.349543 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.031881609 podStartE2EDuration="28.349485289s" podCreationTimestamp="2026-03-20 16:20:26 +0000 UTC" firstStartedPulling="2026-03-20 16:20:28.175312988 +0000 UTC m=+1148.208942525" lastFinishedPulling="2026-03-20 16:20:53.492916668 +0000 UTC m=+1173.526546205" observedRunningTime="2026-03-20 16:20:54.342899283 +0000 UTC m=+1174.376528840" watchObservedRunningTime="2026-03-20 16:20:54.349485289 +0000 UTC m=+1174.383114866" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.370136 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-np69v" podStartSLOduration=4.564907503 podStartE2EDuration="7.370119652s" podCreationTimestamp="2026-03-20 16:20:47 +0000 UTC" firstStartedPulling="2026-03-20 16:20:50.22938015 +0000 UTC m=+1170.263009687" lastFinishedPulling="2026-03-20 16:20:53.034592299 +0000 UTC m=+1173.068221836" observedRunningTime="2026-03-20 16:20:54.368551368 +0000 UTC m=+1174.402180915" watchObservedRunningTime="2026-03-20 16:20:54.370119652 +0000 UTC m=+1174.403749199" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.431913 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" podStartSLOduration=7.431894998 podStartE2EDuration="7.431894998s" podCreationTimestamp="2026-03-20 16:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:54.411671616 +0000 UTC m=+1174.445301153" watchObservedRunningTime="2026-03-20 16:20:54.431894998 +0000 UTC 
m=+1174.465524535" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.462199 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks2w8"] Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.467883 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks2w8"] Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.626560 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:54 crc kubenswrapper[4675]: I0320 16:20:54.682993 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" path="/var/lib/kubelet/pods/9e98f660-63bc-4b67-ba1d-1bb9da72aa6f/volumes" Mar 20 16:20:55 crc kubenswrapper[4675]: I0320 16:20:55.286547 4675 generic.go:334] "Generic (PLEG): container finished" podID="acbf924d-7363-4489-a64a-51c2949a2a69" containerID="d9278c8db3d89c2706a94c22b86ff6fe071b563a13a884a2d152ca1fd7150b19" exitCode=0 Mar 20 16:20:55 crc kubenswrapper[4675]: I0320 16:20:55.286670 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"acbf924d-7363-4489-a64a-51c2949a2a69","Type":"ContainerDied","Data":"d9278c8db3d89c2706a94c22b86ff6fe071b563a13a884a2d152ca1fd7150b19"} Mar 20 16:20:56 crc kubenswrapper[4675]: I0320 16:20:56.309441 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"acbf924d-7363-4489-a64a-51c2949a2a69","Type":"ContainerStarted","Data":"f114446cdc809153ecd1ca82d9d19cacfd55a302cf94de72a7ea9e4d9335b1ad"} Mar 20 16:20:56 crc kubenswrapper[4675]: I0320 16:20:56.332920 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.706707026 podStartE2EDuration="41.33290111s" podCreationTimestamp="2026-03-20 16:20:15 +0000 UTC" firstStartedPulling="2026-03-20 16:20:26.888016736 +0000 UTC 
m=+1146.921646273" lastFinishedPulling="2026-03-20 16:20:49.51421082 +0000 UTC m=+1169.547840357" observedRunningTime="2026-03-20 16:20:56.332297133 +0000 UTC m=+1176.365926710" watchObservedRunningTime="2026-03-20 16:20:56.33290111 +0000 UTC m=+1176.366530657" Mar 20 16:20:57 crc kubenswrapper[4675]: I0320 16:20:57.248081 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 16:20:57 crc kubenswrapper[4675]: I0320 16:20:57.248285 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 16:20:57 crc kubenswrapper[4675]: I0320 16:20:57.320991 4675 generic.go:334] "Generic (PLEG): container finished" podID="47fb8c80-d4bd-42fb-bcc3-752f854574b4" containerID="d84c888406f34fc2ad7bc676a94257e95cf190b9b2c646e28a7f248dfd2891fb" exitCode=0 Mar 20 16:20:57 crc kubenswrapper[4675]: I0320 16:20:57.321075 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"47fb8c80-d4bd-42fb-bcc3-752f854574b4","Type":"ContainerDied","Data":"d84c888406f34fc2ad7bc676a94257e95cf190b9b2c646e28a7f248dfd2891fb"} Mar 20 16:20:57 crc kubenswrapper[4675]: I0320 16:20:57.626885 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:57 crc kubenswrapper[4675]: I0320 16:20:57.663642 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.323050 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.334928 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"47fb8c80-d4bd-42fb-bcc3-752f854574b4","Type":"ContainerStarted","Data":"413b0349e9f5d10060a48026b787f529720ad113d9fd61548fa0b2005e10cdb3"} Mar 
20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.385550 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ptlpx"] Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.385851 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" podUID="b91e693b-6453-443b-8211-6a29b07c91bd" containerName="dnsmasq-dns" containerID="cri-o://23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4" gracePeriod=10 Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.394866 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.545561925 podStartE2EDuration="41.394843188s" podCreationTimestamp="2026-03-20 16:20:17 +0000 UTC" firstStartedPulling="2026-03-20 16:20:27.664978908 +0000 UTC m=+1147.698608455" lastFinishedPulling="2026-03-20 16:20:49.514260171 +0000 UTC m=+1169.547889718" observedRunningTime="2026-03-20 16:20:58.3821583 +0000 UTC m=+1178.415787857" watchObservedRunningTime="2026-03-20 16:20:58.394843188 +0000 UTC m=+1178.428472725" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.421110 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.764102 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.764159 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.956451 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.981401 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb2cj\" (UniqueName: \"kubernetes.io/projected/b91e693b-6453-443b-8211-6a29b07c91bd-kube-api-access-zb2cj\") pod \"b91e693b-6453-443b-8211-6a29b07c91bd\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.981497 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-config\") pod \"b91e693b-6453-443b-8211-6a29b07c91bd\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " Mar 20 16:20:58 crc kubenswrapper[4675]: I0320 16:20:58.981546 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-dns-svc\") pod \"b91e693b-6453-443b-8211-6a29b07c91bd\" (UID: \"b91e693b-6453-443b-8211-6a29b07c91bd\") " Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.003677 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91e693b-6453-443b-8211-6a29b07c91bd-kube-api-access-zb2cj" (OuterVolumeSpecName: "kube-api-access-zb2cj") pod "b91e693b-6453-443b-8211-6a29b07c91bd" (UID: "b91e693b-6453-443b-8211-6a29b07c91bd"). InnerVolumeSpecName "kube-api-access-zb2cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.022607 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b91e693b-6453-443b-8211-6a29b07c91bd" (UID: "b91e693b-6453-443b-8211-6a29b07c91bd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.027344 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-config" (OuterVolumeSpecName: "config") pod "b91e693b-6453-443b-8211-6a29b07c91bd" (UID: "b91e693b-6453-443b-8211-6a29b07c91bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.078615 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.082893 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.082919 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91e693b-6453-443b-8211-6a29b07c91bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.082928 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb2cj\" (UniqueName: \"kubernetes.io/projected/b91e693b-6453-443b-8211-6a29b07c91bd-kube-api-access-zb2cj\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.342746 4675 generic.go:334] "Generic (PLEG): container finished" podID="b91e693b-6453-443b-8211-6a29b07c91bd" containerID="23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4" exitCode=0 Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.342830 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.342824 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" event={"ID":"b91e693b-6453-443b-8211-6a29b07c91bd","Type":"ContainerDied","Data":"23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4"} Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.342970 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ptlpx" event={"ID":"b91e693b-6453-443b-8211-6a29b07c91bd","Type":"ContainerDied","Data":"dfabfb4ed8716099179062a40bfdc8bbe9b9fd6a0320af48b958db24b47bf033"} Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.343007 4675 scope.go:117] "RemoveContainer" containerID="23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.359368 4675 scope.go:117] "RemoveContainer" containerID="b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.372504 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ptlpx"] Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.395407 4675 scope.go:117] "RemoveContainer" containerID="23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4" Mar 20 16:20:59 crc kubenswrapper[4675]: E0320 16:20:59.398062 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4\": container with ID starting with 23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4 not found: ID does not exist" containerID="23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.398107 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4"} err="failed to get container status \"23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4\": rpc error: code = NotFound desc = could not find container \"23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4\": container with ID starting with 23d30f6bea431a960369b1e20a5d30edd47377b9131f46c4f5818840a26f5cc4 not found: ID does not exist" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.398133 4675 scope.go:117] "RemoveContainer" containerID="b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85" Mar 20 16:20:59 crc kubenswrapper[4675]: E0320 16:20:59.398761 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85\": container with ID starting with b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85 not found: ID does not exist" containerID="b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.398801 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85"} err="failed to get container status \"b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85\": rpc error: code = NotFound desc = could not find container \"b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85\": container with ID starting with b3ecbbaa943e4c33cb7631aa438c68ac2e40b8d9b0c1df6b3f5a05c23bcb8d85 not found: ID does not exist" Mar 20 16:20:59 crc kubenswrapper[4675]: I0320 16:20:59.401842 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ptlpx"] Mar 20 16:21:00 crc kubenswrapper[4675]: I0320 16:21:00.685643 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b91e693b-6453-443b-8211-6a29b07c91bd" path="/var/lib/kubelet/pods/b91e693b-6453-443b-8211-6a29b07c91bd/volumes" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.082995 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gvz5x"] Mar 20 16:21:01 crc kubenswrapper[4675]: E0320 16:21:01.083344 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91e693b-6453-443b-8211-6a29b07c91bd" containerName="dnsmasq-dns" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083365 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91e693b-6453-443b-8211-6a29b07c91bd" containerName="dnsmasq-dns" Mar 20 16:21:01 crc kubenswrapper[4675]: E0320 16:21:01.083380 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91e693b-6453-443b-8211-6a29b07c91bd" containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083390 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91e693b-6453-443b-8211-6a29b07c91bd" containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: E0320 16:21:01.083411 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="dnsmasq-dns" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083418 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="dnsmasq-dns" Mar 20 16:21:01 crc kubenswrapper[4675]: E0320 16:21:01.083426 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083444 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: E0320 16:21:01.083467 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" 
containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083474 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083629 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f35cdc3f-cbcf-46b5-8988-a077a4e65284" containerName="dnsmasq-dns" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083650 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98f660-63bc-4b67-ba1d-1bb9da72aa6f" containerName="init" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.083665 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91e693b-6453-443b-8211-6a29b07c91bd" containerName="dnsmasq-dns" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.084553 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.106808 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gvz5x"] Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.218922 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.219008 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-config\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 
16:21:01.219061 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-dns-svc\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.219099 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.219292 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx5qs\" (UniqueName: \"kubernetes.io/projected/a6f5e724-0428-4860-8b37-7c0517ac755d-kube-api-access-nx5qs\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.320546 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.320629 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-config\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 
16:21:01.320714 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-dns-svc\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.320808 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.320883 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx5qs\" (UniqueName: \"kubernetes.io/projected/a6f5e724-0428-4860-8b37-7c0517ac755d-kube-api-access-nx5qs\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.322075 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-dns-svc\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.322438 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.322456 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-config\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.322759 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.341381 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx5qs\" (UniqueName: \"kubernetes.io/projected/a6f5e724-0428-4860-8b37-7c0517ac755d-kube-api-access-nx5qs\") pod \"dnsmasq-dns-698758b865-gvz5x\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.401258 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:01 crc kubenswrapper[4675]: W0320 16:21:01.842459 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f5e724_0428_4860_8b37_7c0517ac755d.slice/crio-4798edb4a968ff249cc94b24245d5d18158314402fbb6d4a5323e8937d2c717d WatchSource:0}: Error finding container 4798edb4a968ff249cc94b24245d5d18158314402fbb6d4a5323e8937d2c717d: Status 404 returned error can't find the container with id 4798edb4a968ff249cc94b24245d5d18158314402fbb6d4a5323e8937d2c717d Mar 20 16:21:01 crc kubenswrapper[4675]: I0320 16:21:01.854815 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gvz5x"] Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.207739 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.213426 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.215566 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.215566 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-khk88" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.215845 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.218294 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.233141 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.338617 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.338947 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c15f64a-1ad0-4072-9f52-3b151c01a21b-lock\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.339074 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc 
kubenswrapper[4675]: I0320 16:21:02.339228 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c15f64a-1ad0-4072-9f52-3b151c01a21b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.339320 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dxc\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-kube-api-access-b4dxc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.339380 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c15f64a-1ad0-4072-9f52-3b151c01a21b-cache\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.369206 4675 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerID="e4fafa668cb38ec178f91b7602896b9cc49b97408a7a24cee7d092a92576bfb6" exitCode=0 Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.369258 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gvz5x" event={"ID":"a6f5e724-0428-4860-8b37-7c0517ac755d","Type":"ContainerDied","Data":"e4fafa668cb38ec178f91b7602896b9cc49b97408a7a24cee7d092a92576bfb6"} Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.369354 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gvz5x" 
event={"ID":"a6f5e724-0428-4860-8b37-7c0517ac755d","Type":"ContainerStarted","Data":"4798edb4a968ff249cc94b24245d5d18158314402fbb6d4a5323e8937d2c717d"} Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.441103 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c15f64a-1ad0-4072-9f52-3b151c01a21b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.441578 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dxc\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-kube-api-access-b4dxc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.441618 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c15f64a-1ad0-4072-9f52-3b151c01a21b-cache\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.441645 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.441676 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c15f64a-1ad0-4072-9f52-3b151c01a21b-lock\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.441699 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: E0320 16:21:02.441848 4675 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:21:02 crc kubenswrapper[4675]: E0320 16:21:02.441865 4675 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:21:02 crc kubenswrapper[4675]: E0320 16:21:02.441905 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift podName:1c15f64a-1ad0-4072-9f52-3b151c01a21b nodeName:}" failed. No retries permitted until 2026-03-20 16:21:02.941889374 +0000 UTC m=+1182.975518901 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift") pod "swift-storage-0" (UID: "1c15f64a-1ad0-4072-9f52-3b151c01a21b") : configmap "swift-ring-files" not found Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.442157 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.442549 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c15f64a-1ad0-4072-9f52-3b151c01a21b-cache\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.445550 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c15f64a-1ad0-4072-9f52-3b151c01a21b-lock\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.446750 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c15f64a-1ad0-4072-9f52-3b151c01a21b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.459731 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dxc\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-kube-api-access-b4dxc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " 
pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.465962 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.536146 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-g2g29"] Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.537159 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.541411 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.541895 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.542118 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.548964 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-g2g29"] Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.644837 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-etc-swift\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.644894 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-ring-data-devices\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.644926 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-scripts\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.644951 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-combined-ca-bundle\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.644997 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q528s\" (UniqueName: \"kubernetes.io/projected/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-kube-api-access-q528s\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.645028 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-dispersionconf\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.645045 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-swiftconf\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.746608 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q528s\" (UniqueName: \"kubernetes.io/projected/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-kube-api-access-q528s\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.746706 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-dispersionconf\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.746733 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-swiftconf\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.746835 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-etc-swift\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.746896 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-ring-data-devices\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.747084 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-scripts\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.747159 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-combined-ca-bundle\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.748602 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-ring-data-devices\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.749139 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-scripts\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.749684 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-etc-swift\") pod \"swift-ring-rebalance-g2g29\" (UID: 
\"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.751410 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-dispersionconf\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.751909 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-combined-ca-bundle\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.752608 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-swiftconf\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.764523 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q528s\" (UniqueName: \"kubernetes.io/projected/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-kube-api-access-q528s\") pod \"swift-ring-rebalance-g2g29\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") " pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.851237 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.888726 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-g2g29" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.946590 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 16:21:02 crc kubenswrapper[4675]: I0320 16:21:02.949986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:02 crc kubenswrapper[4675]: E0320 16:21:02.950150 4675 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:21:02 crc kubenswrapper[4675]: E0320 16:21:02.950177 4675 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:21:02 crc kubenswrapper[4675]: E0320 16:21:02.950226 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift podName:1c15f64a-1ad0-4072-9f52-3b151c01a21b nodeName:}" failed. No retries permitted until 2026-03-20 16:21:03.950207467 +0000 UTC m=+1183.983837004 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift") pod "swift-storage-0" (UID: "1c15f64a-1ad0-4072-9f52-3b151c01a21b") : configmap "swift-ring-files" not found Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.320336 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-g2g29"] Mar 20 16:21:03 crc kubenswrapper[4675]: W0320 16:21:03.325226 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod011fcaf2_19dd_4b94_98c8_ba1ba81cd656.slice/crio-617e552b04be129bb2cfff9e1dd4c6cbcf28839c45fec748015650d5c675a08f WatchSource:0}: Error finding container 617e552b04be129bb2cfff9e1dd4c6cbcf28839c45fec748015650d5c675a08f: Status 404 returned error can't find the container with id 617e552b04be129bb2cfff9e1dd4c6cbcf28839c45fec748015650d5c675a08f Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.359169 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.390083 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g2g29" event={"ID":"011fcaf2-19dd-4b94-98c8-ba1ba81cd656","Type":"ContainerStarted","Data":"617e552b04be129bb2cfff9e1dd4c6cbcf28839c45fec748015650d5c675a08f"} Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.392155 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gvz5x" event={"ID":"a6f5e724-0428-4860-8b37-7c0517ac755d","Type":"ContainerStarted","Data":"973584f860f0c55b564ad7fd4b53111645f156751d4ae58a9eefc3f8838d78d9"} Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.411418 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gvz5x" podStartSLOduration=2.411402918 
podStartE2EDuration="2.411402918s" podCreationTimestamp="2026-03-20 16:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:03.410466671 +0000 UTC m=+1183.444096218" watchObservedRunningTime="2026-03-20 16:21:03.411402918 +0000 UTC m=+1183.445032455" Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.445492 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 16:21:03 crc kubenswrapper[4675]: I0320 16:21:03.965880 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:03 crc kubenswrapper[4675]: E0320 16:21:03.966135 4675 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:21:03 crc kubenswrapper[4675]: E0320 16:21:03.966170 4675 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:21:03 crc kubenswrapper[4675]: E0320 16:21:03.966241 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift podName:1c15f64a-1ad0-4072-9f52-3b151c01a21b nodeName:}" failed. No retries permitted until 2026-03-20 16:21:05.966217884 +0000 UTC m=+1185.999847421 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift") pod "swift-storage-0" (UID: "1c15f64a-1ad0-4072-9f52-3b151c01a21b") : configmap "swift-ring-files" not found Mar 20 16:21:04 crc kubenswrapper[4675]: I0320 16:21:04.401744 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:04 crc kubenswrapper[4675]: I0320 16:21:04.425606 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:21:04 crc kubenswrapper[4675]: I0320 16:21:04.425659 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:21:04 crc kubenswrapper[4675]: I0320 16:21:04.425702 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:21:04 crc kubenswrapper[4675]: I0320 16:21:04.426554 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82c9eee9831702a396f1ab945d5ca7dca1e7cbf4d14cc472240a2d6bc5bec93c"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:21:04 crc kubenswrapper[4675]: I0320 16:21:04.426630 4675 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://82c9eee9831702a396f1ab945d5ca7dca1e7cbf4d14cc472240a2d6bc5bec93c" gracePeriod=600 Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.408576 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="82c9eee9831702a396f1ab945d5ca7dca1e7cbf4d14cc472240a2d6bc5bec93c" exitCode=0 Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.408633 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"82c9eee9831702a396f1ab945d5ca7dca1e7cbf4d14cc472240a2d6bc5bec93c"} Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.408985 4675 scope.go:117] "RemoveContainer" containerID="8b8ad200c1fd09c2db80eb419a24ccfd9bc395099eb6158f7f572743729d42ad" Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.410975 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87f2f4be-70c8-409a-8fe8-c753758021f4","Type":"ContainerStarted","Data":"20d1a869ae17f1da58e250b68e2d939922c644729f6168caddf3a38b6b9e7da3"} Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.989072 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4799m"] Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.990574 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4799m" Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.992521 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 16:21:05 crc kubenswrapper[4675]: I0320 16:21:05.999688 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4799m"] Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.000777 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0" Mar 20 16:21:06 crc kubenswrapper[4675]: E0320 16:21:06.000932 4675 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:21:06 crc kubenswrapper[4675]: E0320 16:21:06.000949 4675 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:21:06 crc kubenswrapper[4675]: E0320 16:21:06.000999 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift podName:1c15f64a-1ad0-4072-9f52-3b151c01a21b nodeName:}" failed. No retries permitted until 2026-03-20 16:21:10.000982954 +0000 UTC m=+1190.034612481 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift") pod "swift-storage-0" (UID: "1c15f64a-1ad0-4072-9f52-3b151c01a21b") : configmap "swift-ring-files" not found
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.102753 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndfm\" (UniqueName: \"kubernetes.io/projected/8c16024b-98df-4aba-b1ce-d0b24c62eb18-kube-api-access-cndfm\") pod \"root-account-create-update-4799m\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") " pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.102827 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c16024b-98df-4aba-b1ce-d0b24c62eb18-operator-scripts\") pod \"root-account-create-update-4799m\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") " pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.204894 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cndfm\" (UniqueName: \"kubernetes.io/projected/8c16024b-98df-4aba-b1ce-d0b24c62eb18-kube-api-access-cndfm\") pod \"root-account-create-update-4799m\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") " pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.204946 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c16024b-98df-4aba-b1ce-d0b24c62eb18-operator-scripts\") pod \"root-account-create-update-4799m\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") " pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.206576 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c16024b-98df-4aba-b1ce-d0b24c62eb18-operator-scripts\") pod \"root-account-create-update-4799m\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") " pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.222220 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndfm\" (UniqueName: \"kubernetes.io/projected/8c16024b-98df-4aba-b1ce-d0b24c62eb18-kube-api-access-cndfm\") pod \"root-account-create-update-4799m\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") " pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.317971 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:06 crc kubenswrapper[4675]: I0320 16:21:06.904308 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4799m"]
Mar 20 16:21:06 crc kubenswrapper[4675]: W0320 16:21:06.908024 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c16024b_98df_4aba_b1ce_d0b24c62eb18.slice/crio-898a4d5161e530cc6f115389a4392c35b145706d662b66390daabb9151ec0b24 WatchSource:0}: Error finding container 898a4d5161e530cc6f115389a4392c35b145706d662b66390daabb9151ec0b24: Status 404 returned error can't find the container with id 898a4d5161e530cc6f115389a4392c35b145706d662b66390daabb9151ec0b24
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.433034 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"7732914d5ec5c37cec22ffa5532f80bae40c4bcbf0ea409824aff0266bbf1edb"}
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.434868 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g2g29" event={"ID":"011fcaf2-19dd-4b94-98c8-ba1ba81cd656","Type":"ContainerStarted","Data":"693f04b7017e0c965fbea364ff3aa97c12ffc773f5d82d986d883ef6877b6ae3"}
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.437428 4675 generic.go:334] "Generic (PLEG): container finished" podID="8c16024b-98df-4aba-b1ce-d0b24c62eb18" containerID="3ad19f1a9a8a7d3ebfb1898bad183c5681bbc9300f4db3816e88fe707b0a7e03" exitCode=0
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.437707 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4799m" event={"ID":"8c16024b-98df-4aba-b1ce-d0b24c62eb18","Type":"ContainerDied","Data":"3ad19f1a9a8a7d3ebfb1898bad183c5681bbc9300f4db3816e88fe707b0a7e03"}
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.437914 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4799m" event={"ID":"8c16024b-98df-4aba-b1ce-d0b24c62eb18","Type":"ContainerStarted","Data":"898a4d5161e530cc6f115389a4392c35b145706d662b66390daabb9151ec0b24"}
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.439537 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67c80b11-0763-4407-ad6b-5f1fef8ad591","Type":"ContainerStarted","Data":"194bc19b7433175bf4a47dcff825c330ce0b02faca2e3187a5498dd9f798e823"}
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.441302 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8f8e55f-429c-43cf-9aea-9524bf3caac7","Type":"ContainerStarted","Data":"0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327"}
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.441536 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.464080 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.504095 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-g2g29" podStartSLOduration=2.319203147 podStartE2EDuration="5.504064013s" podCreationTimestamp="2026-03-20 16:21:02 +0000 UTC" firstStartedPulling="2026-03-20 16:21:03.327198459 +0000 UTC m=+1183.360827996" lastFinishedPulling="2026-03-20 16:21:06.512059315 +0000 UTC m=+1186.545688862" observedRunningTime="2026-03-20 16:21:07.493982348 +0000 UTC m=+1187.527611885" watchObservedRunningTime="2026-03-20 16:21:07.504064013 +0000 UTC m=+1187.537693590"
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.515960 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.635185487 podStartE2EDuration="43.515937709s" podCreationTimestamp="2026-03-20 16:20:24 +0000 UTC" firstStartedPulling="2026-03-20 16:20:28.637991192 +0000 UTC m=+1148.671620729" lastFinishedPulling="2026-03-20 16:21:06.518743414 +0000 UTC m=+1186.552372951" observedRunningTime="2026-03-20 16:21:07.509860077 +0000 UTC m=+1187.543489614" watchObservedRunningTime="2026-03-20 16:21:07.515937709 +0000 UTC m=+1187.549567256"
Mar 20 16:21:07 crc kubenswrapper[4675]: I0320 16:21:07.535522 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.469881633 podStartE2EDuration="47.535504562s" podCreationTimestamp="2026-03-20 16:20:20 +0000 UTC" firstStartedPulling="2026-03-20 16:20:27.449915724 +0000 UTC m=+1147.483545261" lastFinishedPulling="2026-03-20 16:21:06.515538653 +0000 UTC m=+1186.549168190" observedRunningTime="2026-03-20 16:21:07.531275532 +0000 UTC m=+1187.564905069" watchObservedRunningTime="2026-03-20 16:21:07.535504562 +0000 UTC m=+1187.569134099"
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.448438 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2786789-8885-42c4-9127-c0466e2212eb","Type":"ContainerStarted","Data":"1c9261d3e9ae31d8738e1009cecc91ec6036c05963b24fbe82f9ff9760c474b8"}
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.798605 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.848838 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cndfm\" (UniqueName: \"kubernetes.io/projected/8c16024b-98df-4aba-b1ce-d0b24c62eb18-kube-api-access-cndfm\") pod \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") "
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.848947 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c16024b-98df-4aba-b1ce-d0b24c62eb18-operator-scripts\") pod \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\" (UID: \"8c16024b-98df-4aba-b1ce-d0b24c62eb18\") "
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.849949 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c16024b-98df-4aba-b1ce-d0b24c62eb18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c16024b-98df-4aba-b1ce-d0b24c62eb18" (UID: "8c16024b-98df-4aba-b1ce-d0b24c62eb18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.856985 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c16024b-98df-4aba-b1ce-d0b24c62eb18-kube-api-access-cndfm" (OuterVolumeSpecName: "kube-api-access-cndfm") pod "8c16024b-98df-4aba-b1ce-d0b24c62eb18" (UID: "8c16024b-98df-4aba-b1ce-d0b24c62eb18"). InnerVolumeSpecName "kube-api-access-cndfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.951372 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c16024b-98df-4aba-b1ce-d0b24c62eb18-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:08 crc kubenswrapper[4675]: I0320 16:21:08.951407 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cndfm\" (UniqueName: \"kubernetes.io/projected/8c16024b-98df-4aba-b1ce-d0b24c62eb18-kube-api-access-cndfm\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.216617 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jrt74"]
Mar 20 16:21:09 crc kubenswrapper[4675]: E0320 16:21:09.217062 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c16024b-98df-4aba-b1ce-d0b24c62eb18" containerName="mariadb-account-create-update"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.217082 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c16024b-98df-4aba-b1ce-d0b24c62eb18" containerName="mariadb-account-create-update"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.217278 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c16024b-98df-4aba-b1ce-d0b24c62eb18" containerName="mariadb-account-create-update"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.217862 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.226110 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrt74"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.255196 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl9h7\" (UniqueName: \"kubernetes.io/projected/5c7deb55-a7bb-4207-822f-c348a40ee473-kube-api-access-tl9h7\") pod \"glance-db-create-jrt74\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.255278 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7deb55-a7bb-4207-822f-c348a40ee473-operator-scripts\") pod \"glance-db-create-jrt74\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.314739 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1c94-account-create-update-gs974"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.317044 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.318782 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.321950 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1c94-account-create-update-gs974"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.357244 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561932af-1ef9-47ff-9da6-b661477b60ae-operator-scripts\") pod \"glance-1c94-account-create-update-gs974\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.357340 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl9h7\" (UniqueName: \"kubernetes.io/projected/5c7deb55-a7bb-4207-822f-c348a40ee473-kube-api-access-tl9h7\") pod \"glance-db-create-jrt74\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.357368 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7deb55-a7bb-4207-822f-c348a40ee473-operator-scripts\") pod \"glance-db-create-jrt74\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.357433 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7mg4\" (UniqueName: \"kubernetes.io/projected/561932af-1ef9-47ff-9da6-b661477b60ae-kube-api-access-h7mg4\") pod \"glance-1c94-account-create-update-gs974\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.358291 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7deb55-a7bb-4207-822f-c348a40ee473-operator-scripts\") pod \"glance-db-create-jrt74\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.372879 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl9h7\" (UniqueName: \"kubernetes.io/projected/5c7deb55-a7bb-4207-822f-c348a40ee473-kube-api-access-tl9h7\") pod \"glance-db-create-jrt74\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.456645 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4799m" event={"ID":"8c16024b-98df-4aba-b1ce-d0b24c62eb18","Type":"ContainerDied","Data":"898a4d5161e530cc6f115389a4392c35b145706d662b66390daabb9151ec0b24"}
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.456683 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898a4d5161e530cc6f115389a4392c35b145706d662b66390daabb9151ec0b24"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.456748 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4799m"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.458304 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7mg4\" (UniqueName: \"kubernetes.io/projected/561932af-1ef9-47ff-9da6-b661477b60ae-kube-api-access-h7mg4\") pod \"glance-1c94-account-create-update-gs974\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.458346 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561932af-1ef9-47ff-9da6-b661477b60ae-operator-scripts\") pod \"glance-1c94-account-create-update-gs974\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.459007 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561932af-1ef9-47ff-9da6-b661477b60ae-operator-scripts\") pod \"glance-1c94-account-create-update-gs974\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.475640 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7mg4\" (UniqueName: \"kubernetes.io/projected/561932af-1ef9-47ff-9da6-b661477b60ae-kube-api-access-h7mg4\") pod \"glance-1c94-account-create-update-gs974\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.534942 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.636819 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.858127 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qwg4d"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.859273 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.873140 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qwg4d"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.969177 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3b23-account-create-update-zhhwz"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.969984 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-operator-scripts\") pod \"keystone-db-create-qwg4d\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.970074 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfprt\" (UniqueName: \"kubernetes.io/projected/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-kube-api-access-zfprt\") pod \"keystone-db-create-qwg4d\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.970671 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.976364 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3b23-account-create-update-zhhwz"]
Mar 20 16:21:09 crc kubenswrapper[4675]: I0320 16:21:09.976422 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.021467 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrt74"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.076308 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.076545 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-operator-scripts\") pod \"keystone-db-create-qwg4d\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.076733 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfprt\" (UniqueName: \"kubernetes.io/projected/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-kube-api-access-zfprt\") pod \"keystone-db-create-qwg4d\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.076981 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfrht\" (UniqueName: \"kubernetes.io/projected/b1d96077-f705-44cc-a64d-dd4d7df551a6-kube-api-access-nfrht\") pod \"keystone-3b23-account-create-update-zhhwz\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.077220 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d96077-f705-44cc-a64d-dd4d7df551a6-operator-scripts\") pod \"keystone-3b23-account-create-update-zhhwz\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: E0320 16:21:10.079978 4675 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 16:21:10 crc kubenswrapper[4675]: E0320 16:21:10.080011 4675 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 16:21:10 crc kubenswrapper[4675]: E0320 16:21:10.080123 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift podName:1c15f64a-1ad0-4072-9f52-3b151c01a21b nodeName:}" failed. No retries permitted until 2026-03-20 16:21:18.080102137 +0000 UTC m=+1198.113731684 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift") pod "swift-storage-0" (UID: "1c15f64a-1ad0-4072-9f52-3b151c01a21b") : configmap "swift-ring-files" not found
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.083393 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-operator-scripts\") pod \"keystone-db-create-qwg4d\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.105829 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-272fr"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.109129 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.117196 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-272fr"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.122229 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfprt\" (UniqueName: \"kubernetes.io/projected/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-kube-api-access-zfprt\") pod \"keystone-db-create-qwg4d\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.138464 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1c94-account-create-update-gs974"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.165667 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1bbb-account-create-update-g82mn"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.166612 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.168985 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.172580 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1bbb-account-create-update-g82mn"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.180471 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vkdx\" (UniqueName: \"kubernetes.io/projected/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-kube-api-access-5vkdx\") pod \"placement-db-create-272fr\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.180519 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bdnt\" (UniqueName: \"kubernetes.io/projected/5ccf7599-4ec9-4023-b6a4-9517656c82f8-kube-api-access-7bdnt\") pod \"placement-1bbb-account-create-update-g82mn\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.180565 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-operator-scripts\") pod \"placement-db-create-272fr\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.180640 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ccf7599-4ec9-4023-b6a4-9517656c82f8-operator-scripts\") pod \"placement-1bbb-account-create-update-g82mn\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.180666 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfrht\" (UniqueName: \"kubernetes.io/projected/b1d96077-f705-44cc-a64d-dd4d7df551a6-kube-api-access-nfrht\") pod \"keystone-3b23-account-create-update-zhhwz\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.180695 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d96077-f705-44cc-a64d-dd4d7df551a6-operator-scripts\") pod \"keystone-3b23-account-create-update-zhhwz\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.181544 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d96077-f705-44cc-a64d-dd4d7df551a6-operator-scripts\") pod \"keystone-3b23-account-create-update-zhhwz\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.192381 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.201232 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfrht\" (UniqueName: \"kubernetes.io/projected/b1d96077-f705-44cc-a64d-dd4d7df551a6-kube-api-access-nfrht\") pod \"keystone-3b23-account-create-update-zhhwz\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.283524 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vkdx\" (UniqueName: \"kubernetes.io/projected/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-kube-api-access-5vkdx\") pod \"placement-db-create-272fr\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.283890 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bdnt\" (UniqueName: \"kubernetes.io/projected/5ccf7599-4ec9-4023-b6a4-9517656c82f8-kube-api-access-7bdnt\") pod \"placement-1bbb-account-create-update-g82mn\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.283950 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-operator-scripts\") pod \"placement-db-create-272fr\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.284065 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ccf7599-4ec9-4023-b6a4-9517656c82f8-operator-scripts\") pod \"placement-1bbb-account-create-update-g82mn\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.285104 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ccf7599-4ec9-4023-b6a4-9517656c82f8-operator-scripts\") pod \"placement-1bbb-account-create-update-g82mn\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.285107 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-operator-scripts\") pod \"placement-db-create-272fr\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.295423 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.303350 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vkdx\" (UniqueName: \"kubernetes.io/projected/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-kube-api-access-5vkdx\") pod \"placement-db-create-272fr\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.304396 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bdnt\" (UniqueName: \"kubernetes.io/projected/5ccf7599-4ec9-4023-b6a4-9517656c82f8-kube-api-access-7bdnt\") pod \"placement-1bbb-account-create-update-g82mn\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.464250 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.467205 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1c94-account-create-update-gs974" event={"ID":"561932af-1ef9-47ff-9da6-b661477b60ae","Type":"ContainerStarted","Data":"bab0a5aaab6a8957de3ae771bd3b70048a2c91c254d45f62f9ecc0aae826091c"}
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.468063 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrt74" event={"ID":"5c7deb55-a7bb-4207-822f-c348a40ee473","Type":"ContainerStarted","Data":"a9ae15a15d9cc6494344f37d49e19d16e16e36a289f8b90f6f29ae6c894815fc"}
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.513057 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.542608 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-272fr"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.547918 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.625575 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qwg4d"]
Mar 20 16:21:10 crc kubenswrapper[4675]: W0320 16:21:10.638131 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aee7127_2ed6_4ece_9eaa_0dfda0be02ad.slice/crio-8b8507be29eb620d3e224783c9d0f89cacae8fa409382f9f2c61a8ba261a4ef9 WatchSource:0}: Error finding container 8b8507be29eb620d3e224783c9d0f89cacae8fa409382f9f2c61a8ba261a4ef9: Status 404 returned error can't find the container with id 8b8507be29eb620d3e224783c9d0f89cacae8fa409382f9f2c61a8ba261a4ef9
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.750146 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3b23-account-create-update-zhhwz"]
Mar 20 16:21:10 crc kubenswrapper[4675]: I0320 16:21:10.998794 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-272fr"]
Mar 20 16:21:11 crc kubenswrapper[4675]: W0320 16:21:11.011724 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a8634b_cae4_44ff_b49b_3c2c12ef93fe.slice/crio-8c62699813b2afa7df1d00ac7cbff8bf8ab074e71b792d1fb5c8599e13fb041e WatchSource:0}: Error finding container 8c62699813b2afa7df1d00ac7cbff8bf8ab074e71b792d1fb5c8599e13fb041e: Status 404 returned error can't find the container with id 8c62699813b2afa7df1d00ac7cbff8bf8ab074e71b792d1fb5c8599e13fb041e
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.076975 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1bbb-account-create-update-g82mn"]
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.402877 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gvz5x"
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.477488 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l56nr"]
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.478242 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerName="dnsmasq-dns" containerID="cri-o://33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626" gracePeriod=10
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.496130 4675 generic.go:334] "Generic (PLEG): container finished" podID="5c7deb55-a7bb-4207-822f-c348a40ee473" containerID="fd6c82753ba044d5aa2f2b747fe88d0159a1d37e2f1fc156cf3becb03ef96b18" exitCode=0
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.496218 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrt74" event={"ID":"5c7deb55-a7bb-4207-822f-c348a40ee473","Type":"ContainerDied","Data":"fd6c82753ba044d5aa2f2b747fe88d0159a1d37e2f1fc156cf3becb03ef96b18"}
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.503984 4675 generic.go:334] "Generic (PLEG): container finished" podID="4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" containerID="1e65565122945947fdb277280a030c7f0fe3a61be1efb56485c1efb8eef89ea9" exitCode=0
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.504159 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qwg4d" event={"ID":"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad","Type":"ContainerDied","Data":"1e65565122945947fdb277280a030c7f0fe3a61be1efb56485c1efb8eef89ea9"}
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.504205 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qwg4d" event={"ID":"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad","Type":"ContainerStarted","Data":"8b8507be29eb620d3e224783c9d0f89cacae8fa409382f9f2c61a8ba261a4ef9"}
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.511924 4675 generic.go:334] "Generic (PLEG): container finished" podID="5ccf7599-4ec9-4023-b6a4-9517656c82f8" containerID="4bb3915f874f97ba51894484158e8d30440cf0a06fc3802a6cf5b8a6e55bec67" exitCode=0
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.512057 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1bbb-account-create-update-g82mn" event={"ID":"5ccf7599-4ec9-4023-b6a4-9517656c82f8","Type":"ContainerDied","Data":"4bb3915f874f97ba51894484158e8d30440cf0a06fc3802a6cf5b8a6e55bec67"}
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.512093 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1bbb-account-create-update-g82mn" event={"ID":"5ccf7599-4ec9-4023-b6a4-9517656c82f8","Type":"ContainerStarted","Data":"f69c8639f3b4ed92115d6548323a4d009932cff7dd11a8759f25097e38a82df0"}
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.515303 4675 generic.go:334] "Generic (PLEG): container finished" podID="b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" containerID="6f26dd6172245e083d07812b969d3f493470dca054220bdcc2a5aec2369262d3" exitCode=0
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.515367 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-272fr" event={"ID":"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe","Type":"ContainerDied","Data":"6f26dd6172245e083d07812b969d3f493470dca054220bdcc2a5aec2369262d3"}
Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.515391 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-272fr" event={"ID":"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe","Type":"ContainerStarted","Data":"8c62699813b2afa7df1d00ac7cbff8bf8ab074e71b792d1fb5c8599e13fb041e"}
Mar 
20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.517509 4675 generic.go:334] "Generic (PLEG): container finished" podID="b1d96077-f705-44cc-a64d-dd4d7df551a6" containerID="497f4d7c1119b9973b58b8991ef37976e205dca3b190cd6e110a7aa00c073e7d" exitCode=0 Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.517613 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3b23-account-create-update-zhhwz" event={"ID":"b1d96077-f705-44cc-a64d-dd4d7df551a6","Type":"ContainerDied","Data":"497f4d7c1119b9973b58b8991ef37976e205dca3b190cd6e110a7aa00c073e7d"} Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.517635 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3b23-account-create-update-zhhwz" event={"ID":"b1d96077-f705-44cc-a64d-dd4d7df551a6","Type":"ContainerStarted","Data":"db4aa477e78d95a8b7083b3a5394ab78179018a31d5ffe669ba2f8812b1a6e5b"} Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.518963 4675 generic.go:334] "Generic (PLEG): container finished" podID="561932af-1ef9-47ff-9da6-b661477b60ae" containerID="36ce2698098692684c91e2c881dfc739ce38c9a1f105bee77f181b0ae416ac10" exitCode=0 Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.519740 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1c94-account-create-update-gs974" event={"ID":"561932af-1ef9-47ff-9da6-b661477b60ae","Type":"ContainerDied","Data":"36ce2698098692684c91e2c881dfc739ce38c9a1f105bee77f181b0ae416ac10"} Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.574027 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.737957 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.740203 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.742064 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4nfwd" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.745028 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.750652 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.750883 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.760269 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918321 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918367 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918425 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2z22\" (UniqueName: \"kubernetes.io/projected/53eea136-525b-482e-99ed-7f280dce9186-kube-api-access-m2z22\") pod \"ovn-northd-0\" (UID: 
\"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918447 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53eea136-525b-482e-99ed-7f280dce9186-scripts\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918534 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53eea136-525b-482e-99ed-7f280dce9186-config\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918570 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/53eea136-525b-482e-99ed-7f280dce9186-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:11 crc kubenswrapper[4675]: I0320 16:21:11.918593 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.020508 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53eea136-525b-482e-99ed-7f280dce9186-config\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.021997 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/53eea136-525b-482e-99ed-7f280dce9186-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022082 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022137 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53eea136-525b-482e-99ed-7f280dce9186-config\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022321 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022356 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022501 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2z22\" (UniqueName: \"kubernetes.io/projected/53eea136-525b-482e-99ed-7f280dce9186-kube-api-access-m2z22\") pod \"ovn-northd-0\" (UID: 
\"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022560 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53eea136-525b-482e-99ed-7f280dce9186-scripts\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.022498 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/53eea136-525b-482e-99ed-7f280dce9186-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.023262 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53eea136-525b-482e-99ed-7f280dce9186-scripts\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.030102 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.031650 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.037176 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53eea136-525b-482e-99ed-7f280dce9186-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.049034 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2z22\" (UniqueName: \"kubernetes.io/projected/53eea136-525b-482e-99ed-7f280dce9186-kube-api-access-m2z22\") pod \"ovn-northd-0\" (UID: \"53eea136-525b-482e-99ed-7f280dce9186\") " pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.088056 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.160369 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.326348 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-sb\") pod \"d1a33921-a0a7-417f-bcde-05c6af9baa10\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.326414 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-config\") pod \"d1a33921-a0a7-417f-bcde-05c6af9baa10\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.326434 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-nb\") pod \"d1a33921-a0a7-417f-bcde-05c6af9baa10\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 
16:21:12.326457 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-dns-svc\") pod \"d1a33921-a0a7-417f-bcde-05c6af9baa10\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.326570 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgkbr\" (UniqueName: \"kubernetes.io/projected/d1a33921-a0a7-417f-bcde-05c6af9baa10-kube-api-access-zgkbr\") pod \"d1a33921-a0a7-417f-bcde-05c6af9baa10\" (UID: \"d1a33921-a0a7-417f-bcde-05c6af9baa10\") " Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.347670 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a33921-a0a7-417f-bcde-05c6af9baa10-kube-api-access-zgkbr" (OuterVolumeSpecName: "kube-api-access-zgkbr") pod "d1a33921-a0a7-417f-bcde-05c6af9baa10" (UID: "d1a33921-a0a7-417f-bcde-05c6af9baa10"). InnerVolumeSpecName "kube-api-access-zgkbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.364034 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1a33921-a0a7-417f-bcde-05c6af9baa10" (UID: "d1a33921-a0a7-417f-bcde-05c6af9baa10"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.378021 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1a33921-a0a7-417f-bcde-05c6af9baa10" (UID: "d1a33921-a0a7-417f-bcde-05c6af9baa10"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.391155 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-config" (OuterVolumeSpecName: "config") pod "d1a33921-a0a7-417f-bcde-05c6af9baa10" (UID: "d1a33921-a0a7-417f-bcde-05c6af9baa10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.392426 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1a33921-a0a7-417f-bcde-05c6af9baa10" (UID: "d1a33921-a0a7-417f-bcde-05c6af9baa10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.392476 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4799m"] Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.397856 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4799m"] Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.428738 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.428812 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.428824 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.428835 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a33921-a0a7-417f-bcde-05c6af9baa10-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.428848 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgkbr\" (UniqueName: \"kubernetes.io/projected/d1a33921-a0a7-417f-bcde-05c6af9baa10-kube-api-access-zgkbr\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.527898 4675 generic.go:334] "Generic (PLEG): container finished" podID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerID="33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626" exitCode=0 Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.527939 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.527987 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" event={"ID":"d1a33921-a0a7-417f-bcde-05c6af9baa10","Type":"ContainerDied","Data":"33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626"} Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.528029 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l56nr" event={"ID":"d1a33921-a0a7-417f-bcde-05c6af9baa10","Type":"ContainerDied","Data":"7e63d0f0017cd72d4c9121d4bf36436609252f43b64547e513b475937b74316d"} Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.528050 4675 scope.go:117] "RemoveContainer" containerID="33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.555188 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 16:21:12 crc kubenswrapper[4675]: 
I0320 16:21:12.575663 4675 scope.go:117] "RemoveContainer" containerID="64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.576871 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l56nr"] Mar 20 16:21:12 crc kubenswrapper[4675]: W0320 16:21:12.582450 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53eea136_525b_482e_99ed_7f280dce9186.slice/crio-81c3291bfa4954b4f8ad4bf0f699739a3d5a6c1efa1f8218fd0632b38ec304df WatchSource:0}: Error finding container 81c3291bfa4954b4f8ad4bf0f699739a3d5a6c1efa1f8218fd0632b38ec304df: Status 404 returned error can't find the container with id 81c3291bfa4954b4f8ad4bf0f699739a3d5a6c1efa1f8218fd0632b38ec304df Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.589688 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l56nr"] Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.610758 4675 scope.go:117] "RemoveContainer" containerID="33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626" Mar 20 16:21:12 crc kubenswrapper[4675]: E0320 16:21:12.611671 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626\": container with ID starting with 33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626 not found: ID does not exist" containerID="33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.611724 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626"} err="failed to get container status \"33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626\": rpc error: 
code = NotFound desc = could not find container \"33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626\": container with ID starting with 33fd0716bf7fd979590e51b1ca3fcd5d71ca2af778ea0c283e90ed8a5a1a0626 not found: ID does not exist" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.611758 4675 scope.go:117] "RemoveContainer" containerID="64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa" Mar 20 16:21:12 crc kubenswrapper[4675]: E0320 16:21:12.612068 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa\": container with ID starting with 64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa not found: ID does not exist" containerID="64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.612091 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa"} err="failed to get container status \"64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa\": rpc error: code = NotFound desc = could not find container \"64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa\": container with ID starting with 64917783a5744abab3c2dea4c4343ff8c9ee04de50e654e137f60265d977f2aa not found: ID does not exist" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.687082 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c16024b-98df-4aba-b1ce-d0b24c62eb18" path="/var/lib/kubelet/pods/8c16024b-98df-4aba-b1ce-d0b24c62eb18/volumes" Mar 20 16:21:12 crc kubenswrapper[4675]: I0320 16:21:12.687676 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" path="/var/lib/kubelet/pods/d1a33921-a0a7-417f-bcde-05c6af9baa10/volumes" Mar 20 16:21:12 
crc kubenswrapper[4675]: I0320 16:21:12.961746 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b23-account-create-update-zhhwz" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.155536 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d96077-f705-44cc-a64d-dd4d7df551a6-operator-scripts\") pod \"b1d96077-f705-44cc-a64d-dd4d7df551a6\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.155634 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfrht\" (UniqueName: \"kubernetes.io/projected/b1d96077-f705-44cc-a64d-dd4d7df551a6-kube-api-access-nfrht\") pod \"b1d96077-f705-44cc-a64d-dd4d7df551a6\" (UID: \"b1d96077-f705-44cc-a64d-dd4d7df551a6\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.156816 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d96077-f705-44cc-a64d-dd4d7df551a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1d96077-f705-44cc-a64d-dd4d7df551a6" (UID: "b1d96077-f705-44cc-a64d-dd4d7df551a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.160174 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d96077-f705-44cc-a64d-dd4d7df551a6-kube-api-access-nfrht" (OuterVolumeSpecName: "kube-api-access-nfrht") pod "b1d96077-f705-44cc-a64d-dd4d7df551a6" (UID: "b1d96077-f705-44cc-a64d-dd4d7df551a6"). InnerVolumeSpecName "kube-api-access-nfrht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.194198 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jrt74" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.209063 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c94-account-create-update-gs974" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.216751 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-272fr" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.226703 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qwg4d" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.237737 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1bbb-account-create-update-g82mn" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.257717 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d96077-f705-44cc-a64d-dd4d7df551a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.257746 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfrht\" (UniqueName: \"kubernetes.io/projected/b1d96077-f705-44cc-a64d-dd4d7df551a6-kube-api-access-nfrht\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359218 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7deb55-a7bb-4207-822f-c348a40ee473-operator-scripts\") pod \"5c7deb55-a7bb-4207-822f-c348a40ee473\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359257 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vkdx\" (UniqueName: 
\"kubernetes.io/projected/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-kube-api-access-5vkdx\") pod \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359292 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7mg4\" (UniqueName: \"kubernetes.io/projected/561932af-1ef9-47ff-9da6-b661477b60ae-kube-api-access-h7mg4\") pod \"561932af-1ef9-47ff-9da6-b661477b60ae\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359348 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561932af-1ef9-47ff-9da6-b661477b60ae-operator-scripts\") pod \"561932af-1ef9-47ff-9da6-b661477b60ae\" (UID: \"561932af-1ef9-47ff-9da6-b661477b60ae\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359366 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-operator-scripts\") pod \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359430 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfprt\" (UniqueName: \"kubernetes.io/projected/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-kube-api-access-zfprt\") pod \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\" (UID: \"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad\") " Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359462 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ccf7599-4ec9-4023-b6a4-9517656c82f8-operator-scripts\") pod \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") " Mar 20 
16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359520 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-operator-scripts\") pod \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\" (UID: \"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe\") "
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359553 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl9h7\" (UniqueName: \"kubernetes.io/projected/5c7deb55-a7bb-4207-822f-c348a40ee473-kube-api-access-tl9h7\") pod \"5c7deb55-a7bb-4207-822f-c348a40ee473\" (UID: \"5c7deb55-a7bb-4207-822f-c348a40ee473\") "
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.359575 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bdnt\" (UniqueName: \"kubernetes.io/projected/5ccf7599-4ec9-4023-b6a4-9517656c82f8-kube-api-access-7bdnt\") pod \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\" (UID: \"5ccf7599-4ec9-4023-b6a4-9517656c82f8\") "
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.360300 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" (UID: "4aee7127-2ed6-4ece-9eaa-0dfda0be02ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.360911 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561932af-1ef9-47ff-9da6-b661477b60ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "561932af-1ef9-47ff-9da6-b661477b60ae" (UID: "561932af-1ef9-47ff-9da6-b661477b60ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.360992 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" (UID: "b5a8634b-cae4-44ff-b49b-3c2c12ef93fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.361104 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ccf7599-4ec9-4023-b6a4-9517656c82f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ccf7599-4ec9-4023-b6a4-9517656c82f8" (UID: "5ccf7599-4ec9-4023-b6a4-9517656c82f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.361243 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7deb55-a7bb-4207-822f-c348a40ee473-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c7deb55-a7bb-4207-822f-c348a40ee473" (UID: "5c7deb55-a7bb-4207-822f-c348a40ee473"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.363843 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-kube-api-access-zfprt" (OuterVolumeSpecName: "kube-api-access-zfprt") pod "4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" (UID: "4aee7127-2ed6-4ece-9eaa-0dfda0be02ad"). InnerVolumeSpecName "kube-api-access-zfprt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.364269 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561932af-1ef9-47ff-9da6-b661477b60ae-kube-api-access-h7mg4" (OuterVolumeSpecName: "kube-api-access-h7mg4") pod "561932af-1ef9-47ff-9da6-b661477b60ae" (UID: "561932af-1ef9-47ff-9da6-b661477b60ae"). InnerVolumeSpecName "kube-api-access-h7mg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.364333 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ccf7599-4ec9-4023-b6a4-9517656c82f8-kube-api-access-7bdnt" (OuterVolumeSpecName: "kube-api-access-7bdnt") pod "5ccf7599-4ec9-4023-b6a4-9517656c82f8" (UID: "5ccf7599-4ec9-4023-b6a4-9517656c82f8"). InnerVolumeSpecName "kube-api-access-7bdnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.364623 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-kube-api-access-5vkdx" (OuterVolumeSpecName: "kube-api-access-5vkdx") pod "b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" (UID: "b5a8634b-cae4-44ff-b49b-3c2c12ef93fe"). InnerVolumeSpecName "kube-api-access-5vkdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.365053 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7deb55-a7bb-4207-822f-c348a40ee473-kube-api-access-tl9h7" (OuterVolumeSpecName: "kube-api-access-tl9h7") pod "5c7deb55-a7bb-4207-822f-c348a40ee473" (UID: "5c7deb55-a7bb-4207-822f-c348a40ee473"). InnerVolumeSpecName "kube-api-access-tl9h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.460975 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561932af-1ef9-47ff-9da6-b661477b60ae-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461017 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461034 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfprt\" (UniqueName: \"kubernetes.io/projected/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad-kube-api-access-zfprt\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461046 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ccf7599-4ec9-4023-b6a4-9517656c82f8-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461055 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461064 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl9h7\" (UniqueName: \"kubernetes.io/projected/5c7deb55-a7bb-4207-822f-c348a40ee473-kube-api-access-tl9h7\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461073 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bdnt\" (UniqueName: \"kubernetes.io/projected/5ccf7599-4ec9-4023-b6a4-9517656c82f8-kube-api-access-7bdnt\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461081 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7deb55-a7bb-4207-822f-c348a40ee473-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461089 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vkdx\" (UniqueName: \"kubernetes.io/projected/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe-kube-api-access-5vkdx\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.461097 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7mg4\" (UniqueName: \"kubernetes.io/projected/561932af-1ef9-47ff-9da6-b661477b60ae-kube-api-access-h7mg4\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.537133 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1bbb-account-create-update-g82mn" event={"ID":"5ccf7599-4ec9-4023-b6a4-9517656c82f8","Type":"ContainerDied","Data":"f69c8639f3b4ed92115d6548323a4d009932cff7dd11a8759f25097e38a82df0"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.537172 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69c8639f3b4ed92115d6548323a4d009932cff7dd11a8759f25097e38a82df0"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.537171 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1bbb-account-create-update-g82mn"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.540190 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-272fr" event={"ID":"b5a8634b-cae4-44ff-b49b-3c2c12ef93fe","Type":"ContainerDied","Data":"8c62699813b2afa7df1d00ac7cbff8bf8ab074e71b792d1fb5c8599e13fb041e"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.540209 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-272fr"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.540222 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c62699813b2afa7df1d00ac7cbff8bf8ab074e71b792d1fb5c8599e13fb041e"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.541408 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3b23-account-create-update-zhhwz" event={"ID":"b1d96077-f705-44cc-a64d-dd4d7df551a6","Type":"ContainerDied","Data":"db4aa477e78d95a8b7083b3a5394ab78179018a31d5ffe669ba2f8812b1a6e5b"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.541436 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db4aa477e78d95a8b7083b3a5394ab78179018a31d5ffe669ba2f8812b1a6e5b"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.541500 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3b23-account-create-update-zhhwz"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.544733 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1c94-account-create-update-gs974" event={"ID":"561932af-1ef9-47ff-9da6-b661477b60ae","Type":"ContainerDied","Data":"bab0a5aaab6a8957de3ae771bd3b70048a2c91c254d45f62f9ecc0aae826091c"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.544802 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab0a5aaab6a8957de3ae771bd3b70048a2c91c254d45f62f9ecc0aae826091c"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.544813 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c94-account-create-update-gs974"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.549062 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrt74"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.549059 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrt74" event={"ID":"5c7deb55-a7bb-4207-822f-c348a40ee473","Type":"ContainerDied","Data":"a9ae15a15d9cc6494344f37d49e19d16e16e36a289f8b90f6f29ae6c894815fc"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.549106 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ae15a15d9cc6494344f37d49e19d16e16e36a289f8b90f6f29ae6c894815fc"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.550245 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"53eea136-525b-482e-99ed-7f280dce9186","Type":"ContainerStarted","Data":"81c3291bfa4954b4f8ad4bf0f699739a3d5a6c1efa1f8218fd0632b38ec304df"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.552163 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qwg4d" event={"ID":"4aee7127-2ed6-4ece-9eaa-0dfda0be02ad","Type":"ContainerDied","Data":"8b8507be29eb620d3e224783c9d0f89cacae8fa409382f9f2c61a8ba261a4ef9"}
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.552231 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b8507be29eb620d3e224783c9d0f89cacae8fa409382f9f2c61a8ba261a4ef9"
Mar 20 16:21:13 crc kubenswrapper[4675]: I0320 16:21:13.552281 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qwg4d"
Mar 20 16:21:14 crc kubenswrapper[4675]: I0320 16:21:14.560941 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"53eea136-525b-482e-99ed-7f280dce9186","Type":"ContainerStarted","Data":"41a3f59dc750f1e2c9e723136d4fa9ee0c0208558a10042d8d1756d8e7c80777"}
Mar 20 16:21:14 crc kubenswrapper[4675]: I0320 16:21:14.560990 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"53eea136-525b-482e-99ed-7f280dce9186","Type":"ContainerStarted","Data":"643dc1740df65d69f2bc1c796a79ecabe9b439b0dfe04fed6651863d30d512b6"}
Mar 20 16:21:14 crc kubenswrapper[4675]: I0320 16:21:14.561098 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 20 16:21:14 crc kubenswrapper[4675]: I0320 16:21:14.562566 4675 generic.go:334] "Generic (PLEG): container finished" podID="011fcaf2-19dd-4b94-98c8-ba1ba81cd656" containerID="693f04b7017e0c965fbea364ff3aa97c12ffc773f5d82d986d883ef6877b6ae3" exitCode=0
Mar 20 16:21:14 crc kubenswrapper[4675]: I0320 16:21:14.562599 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g2g29" event={"ID":"011fcaf2-19dd-4b94-98c8-ba1ba81cd656","Type":"ContainerDied","Data":"693f04b7017e0c965fbea364ff3aa97c12ffc773f5d82d986d883ef6877b6ae3"}
Mar 20 16:21:14 crc kubenswrapper[4675]: I0320 16:21:14.591798 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.360762589 podStartE2EDuration="3.591764941s" podCreationTimestamp="2026-03-20 16:21:11 +0000 UTC" firstStartedPulling="2026-03-20 16:21:12.589835157 +0000 UTC m=+1192.623464694" lastFinishedPulling="2026-03-20 16:21:13.820837509 +0000 UTC m=+1193.854467046" observedRunningTime="2026-03-20 16:21:14.585558226 +0000 UTC m=+1194.619187813" watchObservedRunningTime="2026-03-20 16:21:14.591764941 +0000 UTC m=+1194.625394498"
Mar 20 16:21:15 crc kubenswrapper[4675]: I0320 16:21:15.912618 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-g2g29"
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101606 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-scripts\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101654 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-ring-data-devices\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101699 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-swiftconf\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101719 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-combined-ca-bundle\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101759 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-dispersionconf\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101800 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q528s\" (UniqueName: \"kubernetes.io/projected/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-kube-api-access-q528s\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.101838 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-etc-swift\") pod \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\" (UID: \"011fcaf2-19dd-4b94-98c8-ba1ba81cd656\") "
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.102878 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.103128 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.107017 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-kube-api-access-q528s" (OuterVolumeSpecName: "kube-api-access-q528s") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "kube-api-access-q528s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.111466 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.124070 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-scripts" (OuterVolumeSpecName: "scripts") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.125214 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.125397 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "011fcaf2-19dd-4b94-98c8-ba1ba81cd656" (UID: "011fcaf2-19dd-4b94-98c8-ba1ba81cd656"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203030 4675 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203245 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203255 4675 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203268 4675 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203277 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203284 4675 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.203292 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q528s\" (UniqueName: \"kubernetes.io/projected/011fcaf2-19dd-4b94-98c8-ba1ba81cd656-kube-api-access-q528s\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.580527 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g2g29" event={"ID":"011fcaf2-19dd-4b94-98c8-ba1ba81cd656","Type":"ContainerDied","Data":"617e552b04be129bb2cfff9e1dd4c6cbcf28839c45fec748015650d5c675a08f"}
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.580569 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="617e552b04be129bb2cfff9e1dd4c6cbcf28839c45fec748015650d5c675a08f"
Mar 20 16:21:16 crc kubenswrapper[4675]: I0320 16:21:16.580639 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-g2g29"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.390987 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zdp9x"]
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391312 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7deb55-a7bb-4207-822f-c348a40ee473" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391326 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7deb55-a7bb-4207-822f-c348a40ee473" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391356 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391364 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391376 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccf7599-4ec9-4023-b6a4-9517656c82f8" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391383 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccf7599-4ec9-4023-b6a4-9517656c82f8" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391399 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561932af-1ef9-47ff-9da6-b661477b60ae" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391409 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="561932af-1ef9-47ff-9da6-b661477b60ae" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391432 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391439 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391453 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerName="init"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391460 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerName="init"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391469 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d96077-f705-44cc-a64d-dd4d7df551a6" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391476 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d96077-f705-44cc-a64d-dd4d7df551a6" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391491 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011fcaf2-19dd-4b94-98c8-ba1ba81cd656" containerName="swift-ring-rebalance"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391498 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="011fcaf2-19dd-4b94-98c8-ba1ba81cd656" containerName="swift-ring-rebalance"
Mar 20 16:21:17 crc kubenswrapper[4675]: E0320 16:21:17.391507 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerName="dnsmasq-dns"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391512 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerName="dnsmasq-dns"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391652 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7deb55-a7bb-4207-822f-c348a40ee473" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391663 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ccf7599-4ec9-4023-b6a4-9517656c82f8" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391676 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="011fcaf2-19dd-4b94-98c8-ba1ba81cd656" containerName="swift-ring-rebalance"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391683 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a33921-a0a7-417f-bcde-05c6af9baa10" containerName="dnsmasq-dns"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391692 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391702 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" containerName="mariadb-database-create"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391714 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d96077-f705-44cc-a64d-dd4d7df551a6" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.391720 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="561932af-1ef9-47ff-9da6-b661477b60ae" containerName="mariadb-account-create-update"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.392207 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.395033 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.404118 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zdp9x"]
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.420345 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8txv8\" (UniqueName: \"kubernetes.io/projected/36a4f6bb-496f-4c50-9047-057827aefe77-kube-api-access-8txv8\") pod \"root-account-create-update-zdp9x\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.420456 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a4f6bb-496f-4c50-9047-057827aefe77-operator-scripts\") pod \"root-account-create-update-zdp9x\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.521496 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a4f6bb-496f-4c50-9047-057827aefe77-operator-scripts\") pod \"root-account-create-update-zdp9x\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.521844 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8txv8\" (UniqueName: \"kubernetes.io/projected/36a4f6bb-496f-4c50-9047-057827aefe77-kube-api-access-8txv8\") pod \"root-account-create-update-zdp9x\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.522550 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a4f6bb-496f-4c50-9047-057827aefe77-operator-scripts\") pod \"root-account-create-update-zdp9x\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.539043 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8txv8\" (UniqueName: \"kubernetes.io/projected/36a4f6bb-496f-4c50-9047-057827aefe77-kube-api-access-8txv8\") pod \"root-account-create-update-zdp9x\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:17 crc kubenswrapper[4675]: I0320 16:21:17.708425 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zdp9x"
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.132742 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0"
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.153396 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c15f64a-1ad0-4072-9f52-3b151c01a21b-etc-swift\") pod \"swift-storage-0\" (UID: \"1c15f64a-1ad0-4072-9f52-3b151c01a21b\") " pod="openstack/swift-storage-0"
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.179193 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zdp9x"]
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.428201 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.597409 4675 generic.go:334] "Generic (PLEG): container finished" podID="36a4f6bb-496f-4c50-9047-057827aefe77" containerID="f7f5d86c4305bed12ce387b9b14b043be57f9c9d26cfbfe21b72915cc0bf1d29" exitCode=0
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.597447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zdp9x" event={"ID":"36a4f6bb-496f-4c50-9047-057827aefe77","Type":"ContainerDied","Data":"f7f5d86c4305bed12ce387b9b14b043be57f9c9d26cfbfe21b72915cc0bf1d29"}
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.597470 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zdp9x" event={"ID":"36a4f6bb-496f-4c50-9047-057827aefe77","Type":"ContainerStarted","Data":"05df19a05870cdfaaa6e1edc1316bf4616a2a6a307f136bd5add9d27dd8802e9"}
Mar 20 16:21:18 crc kubenswrapper[4675]: W0320 16:21:18.925018 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c15f64a_1ad0_4072_9f52_3b151c01a21b.slice/crio-35fdb0deb7e8698ff8d4e276d86e479dd856a6c31c85e17f5caa6e48d0b92e37 WatchSource:0}: Error finding container 35fdb0deb7e8698ff8d4e276d86e479dd856a6c31c85e17f5caa6e48d0b92e37: Status 404 returned error can't find the container with id 35fdb0deb7e8698ff8d4e276d86e479dd856a6c31c85e17f5caa6e48d0b92e37
Mar 20 16:21:18 crc kubenswrapper[4675]: I0320 16:21:18.927547 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.367175 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ddcp2"]
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.368711 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.371081 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.371097 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-859cw"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.385213 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ddcp2"]
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.461443 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-config-data\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.461514 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt5gc\" (UniqueName: \"kubernetes.io/projected/97e8338f-ae50-4341-aba0-91bf9890a9bc-kube-api-access-wt5gc\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.461589 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-db-sync-config-data\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.461628 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-combined-ca-bundle\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.563152 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-db-sync-config-data\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.563224 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-combined-ca-bundle\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.563294 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-config-data\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.563324 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt5gc\" (UniqueName: \"kubernetes.io/projected/97e8338f-ae50-4341-aba0-91bf9890a9bc-kube-api-access-wt5gc\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.572664 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-db-sync-config-data\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.572692 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-combined-ca-bundle\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.575565 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-config-data\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.580641 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt5gc\" (UniqueName: \"kubernetes.io/projected/97e8338f-ae50-4341-aba0-91bf9890a9bc-kube-api-access-wt5gc\") pod \"glance-db-sync-ddcp2\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.609862 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"35fdb0deb7e8698ff8d4e276d86e479dd856a6c31c85e17f5caa6e48d0b92e37"}
Mar 20 16:21:19 crc kubenswrapper[4675]: I0320 16:21:19.683868 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ddcp2"
Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.032967 4675 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-zdp9x" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.077302 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8txv8\" (UniqueName: \"kubernetes.io/projected/36a4f6bb-496f-4c50-9047-057827aefe77-kube-api-access-8txv8\") pod \"36a4f6bb-496f-4c50-9047-057827aefe77\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.077409 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a4f6bb-496f-4c50-9047-057827aefe77-operator-scripts\") pod \"36a4f6bb-496f-4c50-9047-057827aefe77\" (UID: \"36a4f6bb-496f-4c50-9047-057827aefe77\") " Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.078554 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a4f6bb-496f-4c50-9047-057827aefe77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36a4f6bb-496f-4c50-9047-057827aefe77" (UID: "36a4f6bb-496f-4c50-9047-057827aefe77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.079044 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36a4f6bb-496f-4c50-9047-057827aefe77-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.083896 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a4f6bb-496f-4c50-9047-057827aefe77-kube-api-access-8txv8" (OuterVolumeSpecName: "kube-api-access-8txv8") pod "36a4f6bb-496f-4c50-9047-057827aefe77" (UID: "36a4f6bb-496f-4c50-9047-057827aefe77"). InnerVolumeSpecName "kube-api-access-8txv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.181676 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8txv8\" (UniqueName: \"kubernetes.io/projected/36a4f6bb-496f-4c50-9047-057827aefe77-kube-api-access-8txv8\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.279252 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ddcp2"] Mar 20 16:21:20 crc kubenswrapper[4675]: W0320 16:21:20.280540 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97e8338f_ae50_4341_aba0_91bf9890a9bc.slice/crio-cd4a87921b110e2421fb3d7efa9aed8a829a75f10dc9aa955c2c5ef75cde3143 WatchSource:0}: Error finding container cd4a87921b110e2421fb3d7efa9aed8a829a75f10dc9aa955c2c5ef75cde3143: Status 404 returned error can't find the container with id cd4a87921b110e2421fb3d7efa9aed8a829a75f10dc9aa955c2c5ef75cde3143 Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.629252 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ddcp2" event={"ID":"97e8338f-ae50-4341-aba0-91bf9890a9bc","Type":"ContainerStarted","Data":"cd4a87921b110e2421fb3d7efa9aed8a829a75f10dc9aa955c2c5ef75cde3143"} Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.633369 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"c0930abde265149009c5f72d64dd9d0aac5c6d499f27131d2e99847caaf48a13"} Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.633412 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"e4476b8453ed90e72f7dfede82dcf3ae87feaa93dd12398f3e6019c638e14173"} Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 
16:21:20.633422 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"4b3205b8415484cbbdf1d4bf2859269635fda36af6c81a2e888895e442d7fa51"} Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.635117 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zdp9x" event={"ID":"36a4f6bb-496f-4c50-9047-057827aefe77","Type":"ContainerDied","Data":"05df19a05870cdfaaa6e1edc1316bf4616a2a6a307f136bd5add9d27dd8802e9"} Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.635135 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05df19a05870cdfaaa6e1edc1316bf4616a2a6a307f136bd5add9d27dd8802e9" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.635187 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zdp9x" Mar 20 16:21:20 crc kubenswrapper[4675]: I0320 16:21:20.984508 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 16:21:22 crc kubenswrapper[4675]: I0320 16:21:22.654004 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"f87bbae0569e22446b9cffb2c62a79f6d364b70b057a8a3e8d7d45cb638d6bc7"} Mar 20 16:21:23 crc kubenswrapper[4675]: I0320 16:21:23.671164 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"ecd1ccfbbeed682377fd60af8f24861f7cb3e96fbeb662f0a44d36c4cb497ced"} Mar 20 16:21:23 crc kubenswrapper[4675]: I0320 16:21:23.671447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"0b1f4c485fd8121c319799f2974cc0cfd3ca75f57b39b041c9899dec65360c09"} Mar 20 16:21:23 crc kubenswrapper[4675]: I0320 16:21:23.671461 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"b13e0719d8ff29eb69ce032603e984a0b8db2f940a603cbd7219fe10b85829a4"} Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.649560 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-b5cf6" podUID="b563a826-d7ed-453e-89f6-aec33699291e" containerName="ovn-controller" probeResult="failure" output=< Mar 20 16:21:24 crc kubenswrapper[4675]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 16:21:24 crc kubenswrapper[4675]: > Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.693980 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"5af90cccb045596463a904a276977d0c0b4f5b955b3afa2cf7fea1fa370c227b"} Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.756511 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.756890 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-prjxt" Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.976400 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b5cf6-config-wmzd6"] Mar 20 16:21:24 crc kubenswrapper[4675]: E0320 16:21:24.978554 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a4f6bb-496f-4c50-9047-057827aefe77" containerName="mariadb-account-create-update" Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.978577 
4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a4f6bb-496f-4c50-9047-057827aefe77" containerName="mariadb-account-create-update" Mar 20 16:21:24 crc kubenswrapper[4675]: I0320 16:21:24.978823 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a4f6bb-496f-4c50-9047-057827aefe77" containerName="mariadb-account-create-update" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:24.981046 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.039261 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b5cf6-config-wmzd6"] Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.039350 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.097288 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdb2\" (UniqueName: \"kubernetes.io/projected/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-kube-api-access-nhdb2\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.097443 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run-ovn\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.097608 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-additional-scripts\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.097802 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.097925 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-log-ovn\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.097970 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-scripts\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200210 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-log-ovn\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200555 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-scripts\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200605 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhdb2\" (UniqueName: \"kubernetes.io/projected/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-kube-api-access-nhdb2\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200649 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-log-ovn\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200669 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run-ovn\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200734 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run-ovn\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.200966 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-additional-scripts\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.201027 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.201255 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.201652 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-additional-scripts\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.203181 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-scripts\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.222614 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdb2\" (UniqueName: 
\"kubernetes.io/projected/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-kube-api-access-nhdb2\") pod \"ovn-controller-b5cf6-config-wmzd6\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.384982 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.724957 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"e8d6520b1b1c0158ccf0beb951f507269b1dadd7ae9803fb6578af39e1cd6e26"} Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.725339 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"46b0a54a25b34eae191198eee85521d0f47a492cf828e0b149726e0c1dd58696"} Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.725356 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"3645d8b27d150303ed37052241041de59e372df3b58d9b95406ed44f2012c8d8"} Mar 20 16:21:25 crc kubenswrapper[4675]: I0320 16:21:25.950994 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b5cf6-config-wmzd6"] Mar 20 16:21:26 crc kubenswrapper[4675]: I0320 16:21:26.738977 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6-config-wmzd6" event={"ID":"3a948fc6-1260-482f-a2c5-1c7a3f0b875e","Type":"ContainerStarted","Data":"b055bd7c9ae18ae8c9597f5d193639f5b94016bbf1cf52348ab0635f3724a027"} Mar 20 16:21:26 crc kubenswrapper[4675]: I0320 16:21:26.746045 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"441b655a2b3cdf7baa8e7c48a89cdec8f13b7b7729fcfd1a46623e045751354b"} Mar 20 16:21:26 crc kubenswrapper[4675]: I0320 16:21:26.746093 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"a55556263e9078628c126741658854446ebeec97ab007118f1c0402a34284f6e"} Mar 20 16:21:27 crc kubenswrapper[4675]: I0320 16:21:27.754995 4675 generic.go:334] "Generic (PLEG): container finished" podID="3a948fc6-1260-482f-a2c5-1c7a3f0b875e" containerID="e2062d5335da5b6ff70f634814cdf8afc41192c45da4e6d162288acbf1b264c7" exitCode=0 Mar 20 16:21:27 crc kubenswrapper[4675]: I0320 16:21:27.755099 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6-config-wmzd6" event={"ID":"3a948fc6-1260-482f-a2c5-1c7a3f0b875e","Type":"ContainerDied","Data":"e2062d5335da5b6ff70f634814cdf8afc41192c45da4e6d162288acbf1b264c7"} Mar 20 16:21:27 crc kubenswrapper[4675]: I0320 16:21:27.763236 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"93666542e1ef2159976be6ae4b5ae1737dc83e338298f89e78dc416ea2d32a24"} Mar 20 16:21:27 crc kubenswrapper[4675]: I0320 16:21:27.763277 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1c15f64a-1ad0-4072-9f52-3b151c01a21b","Type":"ContainerStarted","Data":"06d6f9242d8b323471445826d3a8bb87c024dcb3b72c30827918a753a6c90e76"} Mar 20 16:21:27 crc kubenswrapper[4675]: I0320 16:21:27.809079 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.853972149 podStartE2EDuration="26.809054405s" podCreationTimestamp="2026-03-20 16:21:01 +0000 UTC" firstStartedPulling="2026-03-20 16:21:18.926898837 +0000 UTC 
m=+1198.960528374" lastFinishedPulling="2026-03-20 16:21:24.881981083 +0000 UTC m=+1204.915610630" observedRunningTime="2026-03-20 16:21:27.803978122 +0000 UTC m=+1207.837607679" watchObservedRunningTime="2026-03-20 16:21:27.809054405 +0000 UTC m=+1207.842683942" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.133950 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-rn9sn"] Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.135477 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.137482 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.152079 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-rn9sn"] Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.254474 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-config\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.254546 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.254753 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.254849 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.255093 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.255205 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jtw\" (UniqueName: \"kubernetes.io/projected/f84f49f5-4850-4358-8bc7-5287adddbe4b-kube-api-access-48jtw\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.356545 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.356613 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-48jtw\" (UniqueName: \"kubernetes.io/projected/f84f49f5-4850-4358-8bc7-5287adddbe4b-kube-api-access-48jtw\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.356645 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-config\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.356673 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.356733 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.356761 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.357723 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.357727 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.357727 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-config\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.357886 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.357938 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.375101 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jtw\" (UniqueName: \"kubernetes.io/projected/f84f49f5-4850-4358-8bc7-5287adddbe4b-kube-api-access-48jtw\") pod 
\"dnsmasq-dns-77585f5f8c-rn9sn\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") " pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.460638 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:28 crc kubenswrapper[4675]: I0320 16:21:28.904525 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-rn9sn"] Mar 20 16:21:28 crc kubenswrapper[4675]: W0320 16:21:28.915845 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf84f49f5_4850_4358_8bc7_5287adddbe4b.slice/crio-fe557ddca785e369a8d21a4ae274db10c895a149476fbc7dcd05eb95d0035bff WatchSource:0}: Error finding container fe557ddca785e369a8d21a4ae274db10c895a149476fbc7dcd05eb95d0035bff: Status 404 returned error can't find the container with id fe557ddca785e369a8d21a4ae274db10c895a149476fbc7dcd05eb95d0035bff Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.055287 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.066841 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhdb2\" (UniqueName: \"kubernetes.io/projected/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-kube-api-access-nhdb2\") pod \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.066898 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-log-ovn\") pod \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.066917 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run\") pod \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.066931 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run-ovn\") pod \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.067026 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-additional-scripts\") pod \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.067098 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-scripts\") pod \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\" (UID: \"3a948fc6-1260-482f-a2c5-1c7a3f0b875e\") " Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.067739 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3a948fc6-1260-482f-a2c5-1c7a3f0b875e" (UID: "3a948fc6-1260-482f-a2c5-1c7a3f0b875e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.067927 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run" (OuterVolumeSpecName: "var-run") pod "3a948fc6-1260-482f-a2c5-1c7a3f0b875e" (UID: "3a948fc6-1260-482f-a2c5-1c7a3f0b875e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.068397 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3a948fc6-1260-482f-a2c5-1c7a3f0b875e" (UID: "3a948fc6-1260-482f-a2c5-1c7a3f0b875e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.068540 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3a948fc6-1260-482f-a2c5-1c7a3f0b875e" (UID: "3a948fc6-1260-482f-a2c5-1c7a3f0b875e"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.068908 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-scripts" (OuterVolumeSpecName: "scripts") pod "3a948fc6-1260-482f-a2c5-1c7a3f0b875e" (UID: "3a948fc6-1260-482f-a2c5-1c7a3f0b875e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.071485 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-kube-api-access-nhdb2" (OuterVolumeSpecName: "kube-api-access-nhdb2") pod "3a948fc6-1260-482f-a2c5-1c7a3f0b875e" (UID: "3a948fc6-1260-482f-a2c5-1c7a3f0b875e"). InnerVolumeSpecName "kube-api-access-nhdb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.168139 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhdb2\" (UniqueName: \"kubernetes.io/projected/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-kube-api-access-nhdb2\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.168183 4675 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.168220 4675 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.168234 4675 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-var-run-ovn\") on node \"crc\" DevicePath \"\"" 
Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.168246 4675 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.168258 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a948fc6-1260-482f-a2c5-1c7a3f0b875e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.650874 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-b5cf6" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.781560 4675 generic.go:334] "Generic (PLEG): container finished" podID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerID="50c90cc3fc356de4fd7e736202db2e77355f9b94e4bf3e1b38ed2d40abde1911" exitCode=0 Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.781655 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" event={"ID":"f84f49f5-4850-4358-8bc7-5287adddbe4b","Type":"ContainerDied","Data":"50c90cc3fc356de4fd7e736202db2e77355f9b94e4bf3e1b38ed2d40abde1911"} Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.781686 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" event={"ID":"f84f49f5-4850-4358-8bc7-5287adddbe4b","Type":"ContainerStarted","Data":"fe557ddca785e369a8d21a4ae274db10c895a149476fbc7dcd05eb95d0035bff"} Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.784102 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6-config-wmzd6" event={"ID":"3a948fc6-1260-482f-a2c5-1c7a3f0b875e","Type":"ContainerDied","Data":"b055bd7c9ae18ae8c9597f5d193639f5b94016bbf1cf52348ab0635f3724a027"} Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.784132 4675 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-wmzd6" Mar 20 16:21:29 crc kubenswrapper[4675]: I0320 16:21:29.784141 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b055bd7c9ae18ae8c9597f5d193639f5b94016bbf1cf52348ab0635f3724a027" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.152048 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b5cf6-config-wmzd6"] Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.161628 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b5cf6-config-wmzd6"] Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.262790 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b5cf6-config-t8nsh"] Mar 20 16:21:30 crc kubenswrapper[4675]: E0320 16:21:30.263194 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a948fc6-1260-482f-a2c5-1c7a3f0b875e" containerName="ovn-config" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.263214 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a948fc6-1260-482f-a2c5-1c7a3f0b875e" containerName="ovn-config" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.263434 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a948fc6-1260-482f-a2c5-1c7a3f0b875e" containerName="ovn-config" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.264086 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.265981 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.269212 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b5cf6-config-t8nsh"] Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.286845 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-log-ovn\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.287214 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.287262 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-scripts\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.287280 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run-ovn\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") 
" pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.287334 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrn2\" (UniqueName: \"kubernetes.io/projected/89fd75bc-5bfa-435e-abab-cc24c102dea7-kube-api-access-2mrn2\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.287381 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-additional-scripts\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389195 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-log-ovn\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389261 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389296 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-scripts\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: 
\"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389315 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run-ovn\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389344 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrn2\" (UniqueName: \"kubernetes.io/projected/89fd75bc-5bfa-435e-abab-cc24c102dea7-kube-api-access-2mrn2\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389377 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-additional-scripts\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389590 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run-ovn\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389733 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: 
\"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.389901 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-log-ovn\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.390311 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-additional-scripts\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.391688 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-scripts\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.412413 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrn2\" (UniqueName: \"kubernetes.io/projected/89fd75bc-5bfa-435e-abab-cc24c102dea7-kube-api-access-2mrn2\") pod \"ovn-controller-b5cf6-config-t8nsh\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.578284 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:30 crc kubenswrapper[4675]: I0320 16:21:30.694824 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a948fc6-1260-482f-a2c5-1c7a3f0b875e" path="/var/lib/kubelet/pods/3a948fc6-1260-482f-a2c5-1c7a3f0b875e/volumes" Mar 20 16:21:32 crc kubenswrapper[4675]: I0320 16:21:32.147486 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 16:21:36 crc kubenswrapper[4675]: I0320 16:21:36.840623 4675 generic.go:334] "Generic (PLEG): container finished" podID="87f2f4be-70c8-409a-8fe8-c753758021f4" containerID="20d1a869ae17f1da58e250b68e2d939922c644729f6168caddf3a38b6b9e7da3" exitCode=0 Mar 20 16:21:36 crc kubenswrapper[4675]: I0320 16:21:36.840729 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87f2f4be-70c8-409a-8fe8-c753758021f4","Type":"ContainerDied","Data":"20d1a869ae17f1da58e250b68e2d939922c644729f6168caddf3a38b6b9e7da3"} Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.307074 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b5cf6-config-t8nsh"] Mar 20 16:21:37 crc kubenswrapper[4675]: W0320 16:21:37.314079 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89fd75bc_5bfa_435e_abab_cc24c102dea7.slice/crio-434d33626e35beb4bfadd3d147f9acf9f270ed485da5a551b832bb4d6356fa8d WatchSource:0}: Error finding container 434d33626e35beb4bfadd3d147f9acf9f270ed485da5a551b832bb4d6356fa8d: Status 404 returned error can't find the container with id 434d33626e35beb4bfadd3d147f9acf9f270ed485da5a551b832bb4d6356fa8d Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.850147 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ddcp2" 
event={"ID":"97e8338f-ae50-4341-aba0-91bf9890a9bc","Type":"ContainerStarted","Data":"bd0c8ed35413f56ff4910847636a652a035f6c024fc37b2bdcfd71375215187a"} Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.852574 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87f2f4be-70c8-409a-8fe8-c753758021f4","Type":"ContainerStarted","Data":"bd6a843593a312031ede17366745d9c79dbe13f409dc4264d7575b707328e0c5"} Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.852786 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.854447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6-config-t8nsh" event={"ID":"89fd75bc-5bfa-435e-abab-cc24c102dea7","Type":"ContainerStarted","Data":"0d366b2fd4c42f6eb5252b1d9ff8eb297dd7041e1bf2eaaf03e89294dc825a96"} Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.854473 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6-config-t8nsh" event={"ID":"89fd75bc-5bfa-435e-abab-cc24c102dea7","Type":"ContainerStarted","Data":"434d33626e35beb4bfadd3d147f9acf9f270ed485da5a551b832bb4d6356fa8d"} Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.856152 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" event={"ID":"f84f49f5-4850-4358-8bc7-5287adddbe4b","Type":"ContainerStarted","Data":"811e0d5f5f206ff8e37fa6c22673e584e125149cb7874f1ac5142cd59637feb3"} Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.856339 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.867611 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ddcp2" podStartSLOduration=2.263611848 podStartE2EDuration="18.867595772s" 
podCreationTimestamp="2026-03-20 16:21:19 +0000 UTC" firstStartedPulling="2026-03-20 16:21:20.289243098 +0000 UTC m=+1200.322872635" lastFinishedPulling="2026-03-20 16:21:36.893227022 +0000 UTC m=+1216.926856559" observedRunningTime="2026-03-20 16:21:37.863336831 +0000 UTC m=+1217.896966368" watchObservedRunningTime="2026-03-20 16:21:37.867595772 +0000 UTC m=+1217.901225309" Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.889122 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-b5cf6-config-t8nsh" podStartSLOduration=7.889099979 podStartE2EDuration="7.889099979s" podCreationTimestamp="2026-03-20 16:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:37.882968606 +0000 UTC m=+1217.916598153" watchObservedRunningTime="2026-03-20 16:21:37.889099979 +0000 UTC m=+1217.922729516" Mar 20 16:21:37 crc kubenswrapper[4675]: I0320 16:21:37.909658 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.841694293 podStartE2EDuration="1m23.90964118s" podCreationTimestamp="2026-03-20 16:20:14 +0000 UTC" firstStartedPulling="2026-03-20 16:20:26.045851777 +0000 UTC m=+1146.079481314" lastFinishedPulling="2026-03-20 16:21:04.113798664 +0000 UTC m=+1184.147428201" observedRunningTime="2026-03-20 16:21:37.906016777 +0000 UTC m=+1217.939646324" watchObservedRunningTime="2026-03-20 16:21:37.90964118 +0000 UTC m=+1217.943270717" Mar 20 16:21:38 crc kubenswrapper[4675]: I0320 16:21:38.865989 4675 generic.go:334] "Generic (PLEG): container finished" podID="89fd75bc-5bfa-435e-abab-cc24c102dea7" containerID="0d366b2fd4c42f6eb5252b1d9ff8eb297dd7041e1bf2eaaf03e89294dc825a96" exitCode=0 Mar 20 16:21:38 crc kubenswrapper[4675]: I0320 16:21:38.866138 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b5cf6-config-t8nsh" 
event={"ID":"89fd75bc-5bfa-435e-abab-cc24c102dea7","Type":"ContainerDied","Data":"0d366b2fd4c42f6eb5252b1d9ff8eb297dd7041e1bf2eaaf03e89294dc825a96"} Mar 20 16:21:38 crc kubenswrapper[4675]: I0320 16:21:38.888076 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" podStartSLOduration=10.888060234 podStartE2EDuration="10.888060234s" podCreationTimestamp="2026-03-20 16:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:37.930604932 +0000 UTC m=+1217.964234469" watchObservedRunningTime="2026-03-20 16:21:38.888060234 +0000 UTC m=+1218.921689771" Mar 20 16:21:39 crc kubenswrapper[4675]: I0320 16:21:39.878446 4675 generic.go:334] "Generic (PLEG): container finished" podID="f2786789-8885-42c4-9127-c0466e2212eb" containerID="1c9261d3e9ae31d8738e1009cecc91ec6036c05963b24fbe82f9ff9760c474b8" exitCode=0 Mar 20 16:21:39 crc kubenswrapper[4675]: I0320 16:21:39.878508 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2786789-8885-42c4-9127-c0466e2212eb","Type":"ContainerDied","Data":"1c9261d3e9ae31d8738e1009cecc91ec6036c05963b24fbe82f9ff9760c474b8"} Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.244506 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350459 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run\") pod \"89fd75bc-5bfa-435e-abab-cc24c102dea7\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350521 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-scripts\") pod \"89fd75bc-5bfa-435e-abab-cc24c102dea7\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350537 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run-ovn\") pod \"89fd75bc-5bfa-435e-abab-cc24c102dea7\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350594 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-additional-scripts\") pod \"89fd75bc-5bfa-435e-abab-cc24c102dea7\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350591 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run" (OuterVolumeSpecName: "var-run") pod "89fd75bc-5bfa-435e-abab-cc24c102dea7" (UID: "89fd75bc-5bfa-435e-abab-cc24c102dea7"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350642 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "89fd75bc-5bfa-435e-abab-cc24c102dea7" (UID: "89fd75bc-5bfa-435e-abab-cc24c102dea7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350644 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrn2\" (UniqueName: \"kubernetes.io/projected/89fd75bc-5bfa-435e-abab-cc24c102dea7-kube-api-access-2mrn2\") pod \"89fd75bc-5bfa-435e-abab-cc24c102dea7\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350722 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-log-ovn\") pod \"89fd75bc-5bfa-435e-abab-cc24c102dea7\" (UID: \"89fd75bc-5bfa-435e-abab-cc24c102dea7\") " Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.350828 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "89fd75bc-5bfa-435e-abab-cc24c102dea7" (UID: "89fd75bc-5bfa-435e-abab-cc24c102dea7"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.351325 4675 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.351343 4675 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.351355 4675 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89fd75bc-5bfa-435e-abab-cc24c102dea7-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.351467 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "89fd75bc-5bfa-435e-abab-cc24c102dea7" (UID: "89fd75bc-5bfa-435e-abab-cc24c102dea7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.351836 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-scripts" (OuterVolumeSpecName: "scripts") pod "89fd75bc-5bfa-435e-abab-cc24c102dea7" (UID: "89fd75bc-5bfa-435e-abab-cc24c102dea7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.354533 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fd75bc-5bfa-435e-abab-cc24c102dea7-kube-api-access-2mrn2" (OuterVolumeSpecName: "kube-api-access-2mrn2") pod "89fd75bc-5bfa-435e-abab-cc24c102dea7" (UID: "89fd75bc-5bfa-435e-abab-cc24c102dea7"). InnerVolumeSpecName "kube-api-access-2mrn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.376729 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b5cf6-config-t8nsh"] Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.384642 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b5cf6-config-t8nsh"] Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.453492 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.453529 4675 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/89fd75bc-5bfa-435e-abab-cc24c102dea7-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.453544 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrn2\" (UniqueName: \"kubernetes.io/projected/89fd75bc-5bfa-435e-abab-cc24c102dea7-kube-api-access-2mrn2\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.683829 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89fd75bc-5bfa-435e-abab-cc24c102dea7" path="/var/lib/kubelet/pods/89fd75bc-5bfa-435e-abab-cc24c102dea7/volumes" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.886695 4675 
scope.go:117] "RemoveContainer" containerID="0d366b2fd4c42f6eb5252b1d9ff8eb297dd7041e1bf2eaaf03e89294dc825a96" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.887079 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b5cf6-config-t8nsh" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.888860 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2786789-8885-42c4-9127-c0466e2212eb","Type":"ContainerStarted","Data":"d79ddcd7789c6d3da7a23340ddea04ea1ee8e184c9f04a578142a49812420333"} Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.889846 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:21:40 crc kubenswrapper[4675]: I0320 16:21:40.935066 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371949.919731 podStartE2EDuration="1m26.935044959s" podCreationTimestamp="2026-03-20 16:20:14 +0000 UTC" firstStartedPulling="2026-03-20 16:20:27.452632802 +0000 UTC m=+1147.486262329" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:40.929288797 +0000 UTC m=+1220.962918344" watchObservedRunningTime="2026-03-20 16:21:40.935044959 +0000 UTC m=+1220.968674516" Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.461792 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.541865 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gvz5x"] Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.542084 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gvz5x" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerName="dnsmasq-dns" 
containerID="cri-o://973584f860f0c55b564ad7fd4b53111645f156751d4ae58a9eefc3f8838d78d9" gracePeriod=10 Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.915630 4675 generic.go:334] "Generic (PLEG): container finished" podID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerID="973584f860f0c55b564ad7fd4b53111645f156751d4ae58a9eefc3f8838d78d9" exitCode=0 Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.915684 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gvz5x" event={"ID":"a6f5e724-0428-4860-8b37-7c0517ac755d","Type":"ContainerDied","Data":"973584f860f0c55b564ad7fd4b53111645f156751d4ae58a9eefc3f8838d78d9"} Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.915994 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gvz5x" event={"ID":"a6f5e724-0428-4860-8b37-7c0517ac755d","Type":"ContainerDied","Data":"4798edb4a968ff249cc94b24245d5d18158314402fbb6d4a5323e8937d2c717d"} Mar 20 16:21:43 crc kubenswrapper[4675]: I0320 16:21:43.916029 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4798edb4a968ff249cc94b24245d5d18158314402fbb6d4a5323e8937d2c717d" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.071759 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.219076 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-config\") pod \"a6f5e724-0428-4860-8b37-7c0517ac755d\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.219179 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-sb\") pod \"a6f5e724-0428-4860-8b37-7c0517ac755d\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.219217 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-nb\") pod \"a6f5e724-0428-4860-8b37-7c0517ac755d\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.219236 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-dns-svc\") pod \"a6f5e724-0428-4860-8b37-7c0517ac755d\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.219351 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx5qs\" (UniqueName: \"kubernetes.io/projected/a6f5e724-0428-4860-8b37-7c0517ac755d-kube-api-access-nx5qs\") pod \"a6f5e724-0428-4860-8b37-7c0517ac755d\" (UID: \"a6f5e724-0428-4860-8b37-7c0517ac755d\") " Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.229002 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a6f5e724-0428-4860-8b37-7c0517ac755d-kube-api-access-nx5qs" (OuterVolumeSpecName: "kube-api-access-nx5qs") pod "a6f5e724-0428-4860-8b37-7c0517ac755d" (UID: "a6f5e724-0428-4860-8b37-7c0517ac755d"). InnerVolumeSpecName "kube-api-access-nx5qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.261801 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6f5e724-0428-4860-8b37-7c0517ac755d" (UID: "a6f5e724-0428-4860-8b37-7c0517ac755d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.265937 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6f5e724-0428-4860-8b37-7c0517ac755d" (UID: "a6f5e724-0428-4860-8b37-7c0517ac755d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.266101 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6f5e724-0428-4860-8b37-7c0517ac755d" (UID: "a6f5e724-0428-4860-8b37-7c0517ac755d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.268097 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-config" (OuterVolumeSpecName: "config") pod "a6f5e724-0428-4860-8b37-7c0517ac755d" (UID: "a6f5e724-0428-4860-8b37-7c0517ac755d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.320729 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx5qs\" (UniqueName: \"kubernetes.io/projected/a6f5e724-0428-4860-8b37-7c0517ac755d-kube-api-access-nx5qs\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.320762 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.320786 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.320795 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.320803 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6f5e724-0428-4860-8b37-7c0517ac755d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.924718 4675 generic.go:334] "Generic (PLEG): container finished" podID="97e8338f-ae50-4341-aba0-91bf9890a9bc" containerID="bd0c8ed35413f56ff4910847636a652a035f6c024fc37b2bdcfd71375215187a" exitCode=0 Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.924823 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ddcp2" event={"ID":"97e8338f-ae50-4341-aba0-91bf9890a9bc","Type":"ContainerDied","Data":"bd0c8ed35413f56ff4910847636a652a035f6c024fc37b2bdcfd71375215187a"} Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 
16:21:44.924996 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gvz5x" Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.943660 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gvz5x"] Mar 20 16:21:44 crc kubenswrapper[4675]: I0320 16:21:44.951074 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gvz5x"] Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.314049 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ddcp2" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.466832 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-db-sync-config-data\") pod \"97e8338f-ae50-4341-aba0-91bf9890a9bc\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.466883 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-config-data\") pod \"97e8338f-ae50-4341-aba0-91bf9890a9bc\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.466905 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt5gc\" (UniqueName: \"kubernetes.io/projected/97e8338f-ae50-4341-aba0-91bf9890a9bc-kube-api-access-wt5gc\") pod \"97e8338f-ae50-4341-aba0-91bf9890a9bc\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.466929 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-combined-ca-bundle\") pod 
\"97e8338f-ae50-4341-aba0-91bf9890a9bc\" (UID: \"97e8338f-ae50-4341-aba0-91bf9890a9bc\") " Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.472449 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "97e8338f-ae50-4341-aba0-91bf9890a9bc" (UID: "97e8338f-ae50-4341-aba0-91bf9890a9bc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.472491 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e8338f-ae50-4341-aba0-91bf9890a9bc-kube-api-access-wt5gc" (OuterVolumeSpecName: "kube-api-access-wt5gc") pod "97e8338f-ae50-4341-aba0-91bf9890a9bc" (UID: "97e8338f-ae50-4341-aba0-91bf9890a9bc"). InnerVolumeSpecName "kube-api-access-wt5gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.509443 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97e8338f-ae50-4341-aba0-91bf9890a9bc" (UID: "97e8338f-ae50-4341-aba0-91bf9890a9bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.530158 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-config-data" (OuterVolumeSpecName: "config-data") pod "97e8338f-ae50-4341-aba0-91bf9890a9bc" (UID: "97e8338f-ae50-4341-aba0-91bf9890a9bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.568218 4675 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.568241 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.568251 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt5gc\" (UniqueName: \"kubernetes.io/projected/97e8338f-ae50-4341-aba0-91bf9890a9bc-kube-api-access-wt5gc\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.568262 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e8338f-ae50-4341-aba0-91bf9890a9bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.684969 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" path="/var/lib/kubelet/pods/a6f5e724-0428-4860-8b37-7c0517ac755d/volumes" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.939330 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ddcp2" event={"ID":"97e8338f-ae50-4341-aba0-91bf9890a9bc","Type":"ContainerDied","Data":"cd4a87921b110e2421fb3d7efa9aed8a829a75f10dc9aa955c2c5ef75cde3143"} Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.939371 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd4a87921b110e2421fb3d7efa9aed8a829a75f10dc9aa955c2c5ef75cde3143" Mar 20 16:21:46 crc kubenswrapper[4675]: I0320 16:21:46.939456 4675 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ddcp2" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.308728 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7shgn"] Mar 20 16:21:47 crc kubenswrapper[4675]: E0320 16:21:47.309806 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e8338f-ae50-4341-aba0-91bf9890a9bc" containerName="glance-db-sync" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.309842 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e8338f-ae50-4341-aba0-91bf9890a9bc" containerName="glance-db-sync" Mar 20 16:21:47 crc kubenswrapper[4675]: E0320 16:21:47.309883 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fd75bc-5bfa-435e-abab-cc24c102dea7" containerName="ovn-config" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.309890 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fd75bc-5bfa-435e-abab-cc24c102dea7" containerName="ovn-config" Mar 20 16:21:47 crc kubenswrapper[4675]: E0320 16:21:47.309904 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerName="dnsmasq-dns" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.309909 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerName="dnsmasq-dns" Mar 20 16:21:47 crc kubenswrapper[4675]: E0320 16:21:47.309920 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerName="init" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.309926 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" containerName="init" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.310180 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5e724-0428-4860-8b37-7c0517ac755d" 
containerName="dnsmasq-dns" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.310194 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="89fd75bc-5bfa-435e-abab-cc24c102dea7" containerName="ovn-config" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.310211 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e8338f-ae50-4341-aba0-91bf9890a9bc" containerName="glance-db-sync" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.311257 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.343136 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7shgn"] Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.380615 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.380658 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.380702 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-config\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 
20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.380730 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.380760 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.380800 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxt4\" (UniqueName: \"kubernetes.io/projected/06dfefff-e695-44b8-bb74-10ef9a589184-kube-api-access-kmxt4\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.481926 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.481984 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " 
pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.482036 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-config\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.482068 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.482105 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.482131 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxt4\" (UniqueName: \"kubernetes.io/projected/06dfefff-e695-44b8-bb74-10ef9a589184-kube-api-access-kmxt4\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.482904 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc 
kubenswrapper[4675]: I0320 16:21:47.483050 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-config\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.483319 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.483456 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.484222 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.500185 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxt4\" (UniqueName: \"kubernetes.io/projected/06dfefff-e695-44b8-bb74-10ef9a589184-kube-api-access-kmxt4\") pod \"dnsmasq-dns-7ff5475cc9-7shgn\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") " pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:47 crc kubenswrapper[4675]: I0320 16:21:47.637796 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:48 crc kubenswrapper[4675]: I0320 16:21:48.088295 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7shgn"] Mar 20 16:21:48 crc kubenswrapper[4675]: I0320 16:21:48.953209 4675 generic.go:334] "Generic (PLEG): container finished" podID="06dfefff-e695-44b8-bb74-10ef9a589184" containerID="656791b8ce18a50cf9f243ec1a2e07c49d6ad11a51352693abfedd11a08e5532" exitCode=0 Mar 20 16:21:48 crc kubenswrapper[4675]: I0320 16:21:48.953305 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" event={"ID":"06dfefff-e695-44b8-bb74-10ef9a589184","Type":"ContainerDied","Data":"656791b8ce18a50cf9f243ec1a2e07c49d6ad11a51352693abfedd11a08e5532"} Mar 20 16:21:48 crc kubenswrapper[4675]: I0320 16:21:48.953467 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" event={"ID":"06dfefff-e695-44b8-bb74-10ef9a589184","Type":"ContainerStarted","Data":"682331a98e7cf87c23e5479cbd43866486e48afa51d2793cd56dd1badbcb2152"} Mar 20 16:21:49 crc kubenswrapper[4675]: I0320 16:21:49.964948 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" event={"ID":"06dfefff-e695-44b8-bb74-10ef9a589184","Type":"ContainerStarted","Data":"c72d885b2d21c9cbe9ddbeac4ec5c9e97da730c57c9f2a83cd129fe6a71f4ea0"} Mar 20 16:21:49 crc kubenswrapper[4675]: I0320 16:21:49.965259 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" Mar 20 16:21:49 crc kubenswrapper[4675]: I0320 16:21:49.988263 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" podStartSLOduration=2.988241781 podStartE2EDuration="2.988241781s" podCreationTimestamp="2026-03-20 16:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:49.984872196 +0000 UTC m=+1230.018501773" watchObservedRunningTime="2026-03-20 16:21:49.988241781 +0000 UTC m=+1230.021871318" Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.562901 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.858984 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.865703 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jp9g7"] Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.866706 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jp9g7" Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.892849 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jp9g7"] Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.982076 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d571-account-create-update-ghkph"] Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.982980 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d571-account-create-update-ghkph" Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.984818 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 16:21:55 crc kubenswrapper[4675]: I0320 16:21:55.996332 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d571-account-create-update-ghkph"] Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.029958 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p46rg\" (UniqueName: \"kubernetes.io/projected/2e6fe44b-0699-4235-af89-d546820b782a-kube-api-access-p46rg\") pod \"cinder-db-create-jp9g7\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " pod="openstack/cinder-db-create-jp9g7" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.030054 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6fe44b-0699-4235-af89-d546820b782a-operator-scripts\") pod \"cinder-db-create-jp9g7\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " pod="openstack/cinder-db-create-jp9g7" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.065429 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-kwslq"] Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.066623 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kwslq" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.075477 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kwslq"] Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.131830 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6fe44b-0699-4235-af89-d546820b782a-operator-scripts\") pod \"cinder-db-create-jp9g7\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " pod="openstack/cinder-db-create-jp9g7" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.131923 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b7b710-7048-42c1-8215-c242f34da40f-operator-scripts\") pod \"cinder-d571-account-create-update-ghkph\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " pod="openstack/cinder-d571-account-create-update-ghkph" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.131959 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nf98\" (UniqueName: \"kubernetes.io/projected/a0b7b710-7048-42c1-8215-c242f34da40f-kube-api-access-6nf98\") pod \"cinder-d571-account-create-update-ghkph\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " pod="openstack/cinder-d571-account-create-update-ghkph" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.131977 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p46rg\" (UniqueName: \"kubernetes.io/projected/2e6fe44b-0699-4235-af89-d546820b782a-kube-api-access-p46rg\") pod \"cinder-db-create-jp9g7\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " pod="openstack/cinder-db-create-jp9g7" Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.132722 4675 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6fe44b-0699-4235-af89-d546820b782a-operator-scripts\") pod \"cinder-db-create-jp9g7\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " pod="openstack/cinder-db-create-jp9g7"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.154273 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p46rg\" (UniqueName: \"kubernetes.io/projected/2e6fe44b-0699-4235-af89-d546820b782a-kube-api-access-p46rg\") pod \"cinder-db-create-jp9g7\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " pod="openstack/cinder-db-create-jp9g7"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.160333 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b060-account-create-update-5gxch"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.161435 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.163492 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.172385 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b060-account-create-update-5gxch"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.182779 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jp9g7"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.223682 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wsz8z"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.225867 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.231231 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.231470 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk87c"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.231562 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.231593 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.233378 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nf98\" (UniqueName: \"kubernetes.io/projected/a0b7b710-7048-42c1-8215-c242f34da40f-kube-api-access-6nf98\") pod \"cinder-d571-account-create-update-ghkph\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " pod="openstack/cinder-d571-account-create-update-ghkph"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.233462 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c297c7-2162-4bd6-bd83-db8bbc61d008-operator-scripts\") pod \"barbican-b060-account-create-update-5gxch\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.233494 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42f97\" (UniqueName: \"kubernetes.io/projected/45c297c7-2162-4bd6-bd83-db8bbc61d008-kube-api-access-42f97\") pod \"barbican-b060-account-create-update-5gxch\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.233550 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvns7\" (UniqueName: \"kubernetes.io/projected/9503b7d8-c02c-4d31-9711-271cd2be4778-kube-api-access-nvns7\") pod \"barbican-db-create-kwslq\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.233573 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b7b710-7048-42c1-8215-c242f34da40f-operator-scripts\") pod \"cinder-d571-account-create-update-ghkph\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " pod="openstack/cinder-d571-account-create-update-ghkph"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.233599 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9503b7d8-c02c-4d31-9711-271cd2be4778-operator-scripts\") pod \"barbican-db-create-kwslq\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.234543 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b7b710-7048-42c1-8215-c242f34da40f-operator-scripts\") pod \"cinder-d571-account-create-update-ghkph\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " pod="openstack/cinder-d571-account-create-update-ghkph"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.243726 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wsz8z"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.282621 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zj9wf"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.283609 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.291289 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nf98\" (UniqueName: \"kubernetes.io/projected/a0b7b710-7048-42c1-8215-c242f34da40f-kube-api-access-6nf98\") pod \"cinder-d571-account-create-update-ghkph\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " pod="openstack/cinder-d571-account-create-update-ghkph"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.299025 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7038-account-create-update-9c4gc"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.300392 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.302183 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.304087 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d571-account-create-update-ghkph"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.320456 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7038-account-create-update-9c4gc"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335559 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mcr\" (UniqueName: \"kubernetes.io/projected/3f81f694-f0b5-4f31-a090-748418d6fd08-kube-api-access-s4mcr\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335657 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c297c7-2162-4bd6-bd83-db8bbc61d008-operator-scripts\") pod \"barbican-b060-account-create-update-5gxch\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335689 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-combined-ca-bundle\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335725 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42f97\" (UniqueName: \"kubernetes.io/projected/45c297c7-2162-4bd6-bd83-db8bbc61d008-kube-api-access-42f97\") pod \"barbican-b060-account-create-update-5gxch\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335761 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-config-data\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335839 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvns7\" (UniqueName: \"kubernetes.io/projected/9503b7d8-c02c-4d31-9711-271cd2be4778-kube-api-access-nvns7\") pod \"barbican-db-create-kwslq\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.335881 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9503b7d8-c02c-4d31-9711-271cd2be4778-operator-scripts\") pod \"barbican-db-create-kwslq\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.336678 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9503b7d8-c02c-4d31-9711-271cd2be4778-operator-scripts\") pod \"barbican-db-create-kwslq\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.337509 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c297c7-2162-4bd6-bd83-db8bbc61d008-operator-scripts\") pod \"barbican-b060-account-create-update-5gxch\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.361907 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvns7\" (UniqueName: \"kubernetes.io/projected/9503b7d8-c02c-4d31-9711-271cd2be4778-kube-api-access-nvns7\") pod \"barbican-db-create-kwslq\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.361908 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42f97\" (UniqueName: \"kubernetes.io/projected/45c297c7-2162-4bd6-bd83-db8bbc61d008-kube-api-access-42f97\") pod \"barbican-b060-account-create-update-5gxch\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.373247 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b060-account-create-update-5gxch"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.374795 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zj9wf"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.380603 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kwslq"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438031 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-config-data\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438105 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931f540a-ffd9-4d4a-b001-f68408fa02fb-operator-scripts\") pod \"neutron-db-create-zj9wf\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438177 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b97a81-a094-4cb5-ba97-33eb354c1d97-operator-scripts\") pod \"neutron-7038-account-create-update-9c4gc\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438230 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mcr\" (UniqueName: \"kubernetes.io/projected/3f81f694-f0b5-4f31-a090-748418d6fd08-kube-api-access-s4mcr\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438292 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476zp\" (UniqueName: \"kubernetes.io/projected/931f540a-ffd9-4d4a-b001-f68408fa02fb-kube-api-access-476zp\") pod \"neutron-db-create-zj9wf\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438314 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9k9\" (UniqueName: \"kubernetes.io/projected/42b97a81-a094-4cb5-ba97-33eb354c1d97-kube-api-access-4h9k9\") pod \"neutron-7038-account-create-update-9c4gc\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.438348 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-combined-ca-bundle\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.444464 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-combined-ca-bundle\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.447318 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-config-data\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.456554 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mcr\" (UniqueName: \"kubernetes.io/projected/3f81f694-f0b5-4f31-a090-748418d6fd08-kube-api-access-s4mcr\") pod \"keystone-db-sync-wsz8z\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") " pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.540285 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931f540a-ffd9-4d4a-b001-f68408fa02fb-operator-scripts\") pod \"neutron-db-create-zj9wf\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.540390 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b97a81-a094-4cb5-ba97-33eb354c1d97-operator-scripts\") pod \"neutron-7038-account-create-update-9c4gc\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.540497 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9k9\" (UniqueName: \"kubernetes.io/projected/42b97a81-a094-4cb5-ba97-33eb354c1d97-kube-api-access-4h9k9\") pod \"neutron-7038-account-create-update-9c4gc\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.540529 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476zp\" (UniqueName: \"kubernetes.io/projected/931f540a-ffd9-4d4a-b001-f68408fa02fb-kube-api-access-476zp\") pod \"neutron-db-create-zj9wf\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.542558 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931f540a-ffd9-4d4a-b001-f68408fa02fb-operator-scripts\") pod \"neutron-db-create-zj9wf\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.543162 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b97a81-a094-4cb5-ba97-33eb354c1d97-operator-scripts\") pod \"neutron-7038-account-create-update-9c4gc\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.562039 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476zp\" (UniqueName: \"kubernetes.io/projected/931f540a-ffd9-4d4a-b001-f68408fa02fb-kube-api-access-476zp\") pod \"neutron-db-create-zj9wf\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.563378 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9k9\" (UniqueName: \"kubernetes.io/projected/42b97a81-a094-4cb5-ba97-33eb354c1d97-kube-api-access-4h9k9\") pod \"neutron-7038-account-create-update-9c4gc\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.683362 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.700861 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zj9wf"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.712540 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7038-account-create-update-9c4gc"
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.729591 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jp9g7"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.798459 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d571-account-create-update-ghkph"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.917866 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kwslq"]
Mar 20 16:21:56 crc kubenswrapper[4675]: I0320 16:21:56.997396 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b060-account-create-update-5gxch"]
Mar 20 16:21:57 crc kubenswrapper[4675]: W0320 16:21:57.006409 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c297c7_2162_4bd6_bd83_db8bbc61d008.slice/crio-25677d08d455fb8f95d4b1e1b92a46f867b23e2bd9f9275d4b8a46ce4b953cae WatchSource:0}: Error finding container 25677d08d455fb8f95d4b1e1b92a46f867b23e2bd9f9275d4b8a46ce4b953cae: Status 404 returned error can't find the container with id 25677d08d455fb8f95d4b1e1b92a46f867b23e2bd9f9275d4b8a46ce4b953cae
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.033934 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d571-account-create-update-ghkph" event={"ID":"a0b7b710-7048-42c1-8215-c242f34da40f","Type":"ContainerStarted","Data":"21714c3cd544611f44cb056931ac3088c0d1784a6f870cd10c8bf654041c650d"}
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.044975 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jp9g7" event={"ID":"2e6fe44b-0699-4235-af89-d546820b782a","Type":"ContainerStarted","Data":"c1cb804e88f5086e8ec7fa58d372bb57a29f8793c41cdc06c5c27398bc5dd652"}
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.045019 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jp9g7" event={"ID":"2e6fe44b-0699-4235-af89-d546820b782a","Type":"ContainerStarted","Data":"bc8fcde125ce0949419960cf9adaec2284564d013fa38c2ffd88f148a73470c7"}
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.050038 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kwslq" event={"ID":"9503b7d8-c02c-4d31-9711-271cd2be4778","Type":"ContainerStarted","Data":"32f231df29f6119186704088f3872a1097582cd3efe3ddb5c8b1d7de45c346e0"}
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.053941 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b060-account-create-update-5gxch" event={"ID":"45c297c7-2162-4bd6-bd83-db8bbc61d008","Type":"ContainerStarted","Data":"25677d08d455fb8f95d4b1e1b92a46f867b23e2bd9f9275d4b8a46ce4b953cae"}
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.063229 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-jp9g7" podStartSLOduration=2.063205542 podStartE2EDuration="2.063205542s" podCreationTimestamp="2026-03-20 16:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:57.059344335 +0000 UTC m=+1237.092973872" watchObservedRunningTime="2026-03-20 16:21:57.063205542 +0000 UTC m=+1237.096835079"
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.175316 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wsz8z"]
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.298968 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zj9wf"]
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.368128 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7038-account-create-update-9c4gc"]
Mar 20 16:21:57 crc kubenswrapper[4675]: W0320 16:21:57.394637 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42b97a81_a094_4cb5_ba97_33eb354c1d97.slice/crio-e6f6138967e60f80b99711a938ba5addcad267f03e983c7c6e618f5ed594eb4d WatchSource:0}: Error finding container e6f6138967e60f80b99711a938ba5addcad267f03e983c7c6e618f5ed594eb4d: Status 404 returned error can't find the container with id e6f6138967e60f80b99711a938ba5addcad267f03e983c7c6e618f5ed594eb4d
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.641883 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn"
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.721465 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-rn9sn"]
Mar 20 16:21:57 crc kubenswrapper[4675]: I0320 16:21:57.721691 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerName="dnsmasq-dns" containerID="cri-o://811e0d5f5f206ff8e37fa6c22673e584e125149cb7874f1ac5142cd59637feb3" gracePeriod=10
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.071095 4675 generic.go:334] "Generic (PLEG): container finished" podID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerID="811e0d5f5f206ff8e37fa6c22673e584e125149cb7874f1ac5142cd59637feb3" exitCode=0
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.071161 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" event={"ID":"f84f49f5-4850-4358-8bc7-5287adddbe4b","Type":"ContainerDied","Data":"811e0d5f5f206ff8e37fa6c22673e584e125149cb7874f1ac5142cd59637feb3"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.075843 4675 generic.go:334] "Generic (PLEG): container finished" podID="9503b7d8-c02c-4d31-9711-271cd2be4778" containerID="eeb235a93af91eb5e0f789887f256f07d31ba071fcca0fb057b174f2a81edbda" exitCode=0
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.075949 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kwslq" event={"ID":"9503b7d8-c02c-4d31-9711-271cd2be4778","Type":"ContainerDied","Data":"eeb235a93af91eb5e0f789887f256f07d31ba071fcca0fb057b174f2a81edbda"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.079469 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7038-account-create-update-9c4gc" event={"ID":"42b97a81-a094-4cb5-ba97-33eb354c1d97","Type":"ContainerStarted","Data":"4934ba92db35aeda85badaa33f2c923e25abd75d0bfc1c658a88889c496fa48d"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.079511 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7038-account-create-update-9c4gc" event={"ID":"42b97a81-a094-4cb5-ba97-33eb354c1d97","Type":"ContainerStarted","Data":"e6f6138967e60f80b99711a938ba5addcad267f03e983c7c6e618f5ed594eb4d"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.084460 4675 generic.go:334] "Generic (PLEG): container finished" podID="45c297c7-2162-4bd6-bd83-db8bbc61d008" containerID="a035d354514f07a42784453cbf1c18708287a767cde5b5ff3ea3487becced2bf" exitCode=0
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.084546 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b060-account-create-update-5gxch" event={"ID":"45c297c7-2162-4bd6-bd83-db8bbc61d008","Type":"ContainerDied","Data":"a035d354514f07a42784453cbf1c18708287a767cde5b5ff3ea3487becced2bf"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.092692 4675 generic.go:334] "Generic (PLEG): container finished" podID="931f540a-ffd9-4d4a-b001-f68408fa02fb" containerID="76e4969eeedf95ae016802c002e9177a02620f38aadefdd4b2125a86f9955aec" exitCode=0
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.092760 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zj9wf" event={"ID":"931f540a-ffd9-4d4a-b001-f68408fa02fb","Type":"ContainerDied","Data":"76e4969eeedf95ae016802c002e9177a02620f38aadefdd4b2125a86f9955aec"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.092797 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zj9wf" event={"ID":"931f540a-ffd9-4d4a-b001-f68408fa02fb","Type":"ContainerStarted","Data":"3c1b721f2795fbf054ab80197b2f66893776f3d4c7161b632d2022042209b4ea"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.101356 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wsz8z" event={"ID":"3f81f694-f0b5-4f31-a090-748418d6fd08","Type":"ContainerStarted","Data":"635254e0a4f9cf2d1b2fcd72885ddb5621dc81af54c2c5ce4849440f0188193f"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.110133 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d571-account-create-update-ghkph" event={"ID":"a0b7b710-7048-42c1-8215-c242f34da40f","Type":"ContainerDied","Data":"6d9fc7c5607ea1e397ebe2446d4852a94c0858c3cc96acd50e6a521a74349272"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.110328 4675 generic.go:334] "Generic (PLEG): container finished" podID="a0b7b710-7048-42c1-8215-c242f34da40f" containerID="6d9fc7c5607ea1e397ebe2446d4852a94c0858c3cc96acd50e6a521a74349272" exitCode=0
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.112643 4675 generic.go:334] "Generic (PLEG): container finished" podID="2e6fe44b-0699-4235-af89-d546820b782a" containerID="c1cb804e88f5086e8ec7fa58d372bb57a29f8793c41cdc06c5c27398bc5dd652" exitCode=0
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.112670 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jp9g7" event={"ID":"2e6fe44b-0699-4235-af89-d546820b782a","Type":"ContainerDied","Data":"c1cb804e88f5086e8ec7fa58d372bb57a29f8793c41cdc06c5c27398bc5dd652"}
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.113544 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7038-account-create-update-9c4gc" podStartSLOduration=2.113521177 podStartE2EDuration="2.113521177s" podCreationTimestamp="2026-03-20 16:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:58.10751695 +0000 UTC m=+1238.141146487" watchObservedRunningTime="2026-03-20 16:21:58.113521177 +0000 UTC m=+1238.147150714"
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.167242 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn"
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.275891 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-svc\") pod \"f84f49f5-4850-4358-8bc7-5287adddbe4b\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") "
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.276392 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48jtw\" (UniqueName: \"kubernetes.io/projected/f84f49f5-4850-4358-8bc7-5287adddbe4b-kube-api-access-48jtw\") pod \"f84f49f5-4850-4358-8bc7-5287adddbe4b\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") "
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.276483 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-swift-storage-0\") pod \"f84f49f5-4850-4358-8bc7-5287adddbe4b\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") "
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.276549 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-sb\") pod \"f84f49f5-4850-4358-8bc7-5287adddbe4b\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") "
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.276614 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-config\") pod \"f84f49f5-4850-4358-8bc7-5287adddbe4b\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") "
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.276649 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-nb\") pod \"f84f49f5-4850-4358-8bc7-5287adddbe4b\" (UID: \"f84f49f5-4850-4358-8bc7-5287adddbe4b\") "
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.292378 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84f49f5-4850-4358-8bc7-5287adddbe4b-kube-api-access-48jtw" (OuterVolumeSpecName: "kube-api-access-48jtw") pod "f84f49f5-4850-4358-8bc7-5287adddbe4b" (UID: "f84f49f5-4850-4358-8bc7-5287adddbe4b"). InnerVolumeSpecName "kube-api-access-48jtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.327924 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f84f49f5-4850-4358-8bc7-5287adddbe4b" (UID: "f84f49f5-4850-4358-8bc7-5287adddbe4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.339879 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-config" (OuterVolumeSpecName: "config") pod "f84f49f5-4850-4358-8bc7-5287adddbe4b" (UID: "f84f49f5-4850-4358-8bc7-5287adddbe4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.348348 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f84f49f5-4850-4358-8bc7-5287adddbe4b" (UID: "f84f49f5-4850-4358-8bc7-5287adddbe4b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.359693 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f84f49f5-4850-4358-8bc7-5287adddbe4b" (UID: "f84f49f5-4850-4358-8bc7-5287adddbe4b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.378970 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.379237 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.379388 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48jtw\" (UniqueName: \"kubernetes.io/projected/f84f49f5-4850-4358-8bc7-5287adddbe4b-kube-api-access-48jtw\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.379518 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.379621 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.379341 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f84f49f5-4850-4358-8bc7-5287adddbe4b" (UID: "f84f49f5-4850-4358-8bc7-5287adddbe4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:58 crc kubenswrapper[4675]: I0320 16:21:58.480949 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f84f49f5-4850-4358-8bc7-5287adddbe4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.122513 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn"
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.122903 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-rn9sn" event={"ID":"f84f49f5-4850-4358-8bc7-5287adddbe4b","Type":"ContainerDied","Data":"fe557ddca785e369a8d21a4ae274db10c895a149476fbc7dcd05eb95d0035bff"}
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.122952 4675 scope.go:117] "RemoveContainer" containerID="811e0d5f5f206ff8e37fa6c22673e584e125149cb7874f1ac5142cd59637feb3"
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.124547 4675 generic.go:334] "Generic (PLEG): container finished" podID="42b97a81-a094-4cb5-ba97-33eb354c1d97" containerID="4934ba92db35aeda85badaa33f2c923e25abd75d0bfc1c658a88889c496fa48d" exitCode=0
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.124633 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7038-account-create-update-9c4gc" event={"ID":"42b97a81-a094-4cb5-ba97-33eb354c1d97","Type":"ContainerDied","Data":"4934ba92db35aeda85badaa33f2c923e25abd75d0bfc1c658a88889c496fa48d"}
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.155129 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-rn9sn"]
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.161998 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-rn9sn"]
Mar 20 16:21:59 crc kubenswrapper[4675]: I0320 16:21:59.167862 4675 scope.go:117] "RemoveContainer" containerID="50c90cc3fc356de4fd7e736202db2e77355f9b94e4bf3e1b38ed2d40abde1911"
Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.130003 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567062-zt586"]
Mar 20 16:22:00 crc kubenswrapper[4675]: E0320 16:22:00.130755 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerName="init"
Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.130782 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerName="init"
Mar 20 16:22:00 crc kubenswrapper[4675]: E0320 16:22:00.130801 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerName="dnsmasq-dns"
Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.130807 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerName="dnsmasq-dns"
Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.130992 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" containerName="dnsmasq-dns"
Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.131804 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-zt586" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.134529 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.134792 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.135123 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.161027 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-zt586"] Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.210805 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5r4t\" (UniqueName: \"kubernetes.io/projected/04d5bdd9-188b-429c-a240-41424a96b5e4-kube-api-access-n5r4t\") pod \"auto-csr-approver-29567062-zt586\" (UID: \"04d5bdd9-188b-429c-a240-41424a96b5e4\") " pod="openshift-infra/auto-csr-approver-29567062-zt586" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.317270 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5r4t\" (UniqueName: \"kubernetes.io/projected/04d5bdd9-188b-429c-a240-41424a96b5e4-kube-api-access-n5r4t\") pod \"auto-csr-approver-29567062-zt586\" (UID: \"04d5bdd9-188b-429c-a240-41424a96b5e4\") " pod="openshift-infra/auto-csr-approver-29567062-zt586" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.337507 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5r4t\" (UniqueName: \"kubernetes.io/projected/04d5bdd9-188b-429c-a240-41424a96b5e4-kube-api-access-n5r4t\") pod \"auto-csr-approver-29567062-zt586\" (UID: \"04d5bdd9-188b-429c-a240-41424a96b5e4\") " 
pod="openshift-infra/auto-csr-approver-29567062-zt586" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.462405 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-zt586" Mar 20 16:22:00 crc kubenswrapper[4675]: I0320 16:22:00.688968 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84f49f5-4850-4358-8bc7-5287adddbe4b" path="/var/lib/kubelet/pods/f84f49f5-4850-4358-8bc7-5287adddbe4b/volumes" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.570808 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kwslq" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.576795 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b060-account-create-update-5gxch" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.657162 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d571-account-create-update-ghkph" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.686843 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvns7\" (UniqueName: \"kubernetes.io/projected/9503b7d8-c02c-4d31-9711-271cd2be4778-kube-api-access-nvns7\") pod \"9503b7d8-c02c-4d31-9711-271cd2be4778\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.687000 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42f97\" (UniqueName: \"kubernetes.io/projected/45c297c7-2162-4bd6-bd83-db8bbc61d008-kube-api-access-42f97\") pod \"45c297c7-2162-4bd6-bd83-db8bbc61d008\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.687036 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9503b7d8-c02c-4d31-9711-271cd2be4778-operator-scripts\") pod \"9503b7d8-c02c-4d31-9711-271cd2be4778\" (UID: \"9503b7d8-c02c-4d31-9711-271cd2be4778\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.687072 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c297c7-2162-4bd6-bd83-db8bbc61d008-operator-scripts\") pod \"45c297c7-2162-4bd6-bd83-db8bbc61d008\" (UID: \"45c297c7-2162-4bd6-bd83-db8bbc61d008\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.688189 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7038-account-create-update-9c4gc" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.688384 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9503b7d8-c02c-4d31-9711-271cd2be4778-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9503b7d8-c02c-4d31-9711-271cd2be4778" (UID: "9503b7d8-c02c-4d31-9711-271cd2be4778"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.688799 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c297c7-2162-4bd6-bd83-db8bbc61d008-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45c297c7-2162-4bd6-bd83-db8bbc61d008" (UID: "45c297c7-2162-4bd6-bd83-db8bbc61d008"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.693190 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c297c7-2162-4bd6-bd83-db8bbc61d008-kube-api-access-42f97" (OuterVolumeSpecName: "kube-api-access-42f97") pod "45c297c7-2162-4bd6-bd83-db8bbc61d008" (UID: "45c297c7-2162-4bd6-bd83-db8bbc61d008"). InnerVolumeSpecName "kube-api-access-42f97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.694544 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9503b7d8-c02c-4d31-9711-271cd2be4778-kube-api-access-nvns7" (OuterVolumeSpecName: "kube-api-access-nvns7") pod "9503b7d8-c02c-4d31-9711-271cd2be4778" (UID: "9503b7d8-c02c-4d31-9711-271cd2be4778"). InnerVolumeSpecName "kube-api-access-nvns7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.711744 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zj9wf" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.740475 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jp9g7" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.789835 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b97a81-a094-4cb5-ba97-33eb354c1d97-operator-scripts\") pod \"42b97a81-a094-4cb5-ba97-33eb354c1d97\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.789936 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-476zp\" (UniqueName: \"kubernetes.io/projected/931f540a-ffd9-4d4a-b001-f68408fa02fb-kube-api-access-476zp\") pod \"931f540a-ffd9-4d4a-b001-f68408fa02fb\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.790139 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nf98\" (UniqueName: \"kubernetes.io/projected/a0b7b710-7048-42c1-8215-c242f34da40f-kube-api-access-6nf98\") pod \"a0b7b710-7048-42c1-8215-c242f34da40f\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.790232 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931f540a-ffd9-4d4a-b001-f68408fa02fb-operator-scripts\") pod \"931f540a-ffd9-4d4a-b001-f68408fa02fb\" (UID: \"931f540a-ffd9-4d4a-b001-f68408fa02fb\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.790385 4675 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/42b97a81-a094-4cb5-ba97-33eb354c1d97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42b97a81-a094-4cb5-ba97-33eb354c1d97" (UID: "42b97a81-a094-4cb5-ba97-33eb354c1d97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.790395 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b7b710-7048-42c1-8215-c242f34da40f-operator-scripts\") pod \"a0b7b710-7048-42c1-8215-c242f34da40f\" (UID: \"a0b7b710-7048-42c1-8215-c242f34da40f\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791046 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931f540a-ffd9-4d4a-b001-f68408fa02fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "931f540a-ffd9-4d4a-b001-f68408fa02fb" (UID: "931f540a-ffd9-4d4a-b001-f68408fa02fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.790569 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9k9\" (UniqueName: \"kubernetes.io/projected/42b97a81-a094-4cb5-ba97-33eb354c1d97-kube-api-access-4h9k9\") pod \"42b97a81-a094-4cb5-ba97-33eb354c1d97\" (UID: \"42b97a81-a094-4cb5-ba97-33eb354c1d97\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791335 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0b7b710-7048-42c1-8215-c242f34da40f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0b7b710-7048-42c1-8215-c242f34da40f" (UID: "a0b7b710-7048-42c1-8215-c242f34da40f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791893 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931f540a-ffd9-4d4a-b001-f68408fa02fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791911 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42f97\" (UniqueName: \"kubernetes.io/projected/45c297c7-2162-4bd6-bd83-db8bbc61d008-kube-api-access-42f97\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791930 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b7b710-7048-42c1-8215-c242f34da40f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791939 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9503b7d8-c02c-4d31-9711-271cd2be4778-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791948 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c297c7-2162-4bd6-bd83-db8bbc61d008-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791957 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b97a81-a094-4cb5-ba97-33eb354c1d97-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.791972 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvns7\" (UniqueName: \"kubernetes.io/projected/9503b7d8-c02c-4d31-9711-271cd2be4778-kube-api-access-nvns7\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc 
kubenswrapper[4675]: I0320 16:22:02.794823 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931f540a-ffd9-4d4a-b001-f68408fa02fb-kube-api-access-476zp" (OuterVolumeSpecName: "kube-api-access-476zp") pod "931f540a-ffd9-4d4a-b001-f68408fa02fb" (UID: "931f540a-ffd9-4d4a-b001-f68408fa02fb"). InnerVolumeSpecName "kube-api-access-476zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.794874 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b97a81-a094-4cb5-ba97-33eb354c1d97-kube-api-access-4h9k9" (OuterVolumeSpecName: "kube-api-access-4h9k9") pod "42b97a81-a094-4cb5-ba97-33eb354c1d97" (UID: "42b97a81-a094-4cb5-ba97-33eb354c1d97"). InnerVolumeSpecName "kube-api-access-4h9k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.795172 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b7b710-7048-42c1-8215-c242f34da40f-kube-api-access-6nf98" (OuterVolumeSpecName: "kube-api-access-6nf98") pod "a0b7b710-7048-42c1-8215-c242f34da40f" (UID: "a0b7b710-7048-42c1-8215-c242f34da40f"). InnerVolumeSpecName "kube-api-access-6nf98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.866270 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-zt586"] Mar 20 16:22:02 crc kubenswrapper[4675]: W0320 16:22:02.868316 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04d5bdd9_188b_429c_a240_41424a96b5e4.slice/crio-8f96e54de1b60ca63166ee8fb62133a1b5552d7e1109ce93bf9d12707e96b3e4 WatchSource:0}: Error finding container 8f96e54de1b60ca63166ee8fb62133a1b5552d7e1109ce93bf9d12707e96b3e4: Status 404 returned error can't find the container with id 8f96e54de1b60ca63166ee8fb62133a1b5552d7e1109ce93bf9d12707e96b3e4 Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893401 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6fe44b-0699-4235-af89-d546820b782a-operator-scripts\") pod \"2e6fe44b-0699-4235-af89-d546820b782a\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893444 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p46rg\" (UniqueName: \"kubernetes.io/projected/2e6fe44b-0699-4235-af89-d546820b782a-kube-api-access-p46rg\") pod \"2e6fe44b-0699-4235-af89-d546820b782a\" (UID: \"2e6fe44b-0699-4235-af89-d546820b782a\") " Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893815 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6fe44b-0699-4235-af89-d546820b782a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e6fe44b-0699-4235-af89-d546820b782a" (UID: "2e6fe44b-0699-4235-af89-d546820b782a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893958 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6fe44b-0699-4235-af89-d546820b782a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893975 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h9k9\" (UniqueName: \"kubernetes.io/projected/42b97a81-a094-4cb5-ba97-33eb354c1d97-kube-api-access-4h9k9\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893986 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-476zp\" (UniqueName: \"kubernetes.io/projected/931f540a-ffd9-4d4a-b001-f68408fa02fb-kube-api-access-476zp\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.893994 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nf98\" (UniqueName: \"kubernetes.io/projected/a0b7b710-7048-42c1-8215-c242f34da40f-kube-api-access-6nf98\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.896151 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6fe44b-0699-4235-af89-d546820b782a-kube-api-access-p46rg" (OuterVolumeSpecName: "kube-api-access-p46rg") pod "2e6fe44b-0699-4235-af89-d546820b782a" (UID: "2e6fe44b-0699-4235-af89-d546820b782a"). InnerVolumeSpecName "kube-api-access-p46rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:02 crc kubenswrapper[4675]: I0320 16:22:02.995669 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p46rg\" (UniqueName: \"kubernetes.io/projected/2e6fe44b-0699-4235-af89-d546820b782a-kube-api-access-p46rg\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.181786 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-zt586" event={"ID":"04d5bdd9-188b-429c-a240-41424a96b5e4","Type":"ContainerStarted","Data":"8f96e54de1b60ca63166ee8fb62133a1b5552d7e1109ce93bf9d12707e96b3e4"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.183476 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jp9g7" event={"ID":"2e6fe44b-0699-4235-af89-d546820b782a","Type":"ContainerDied","Data":"bc8fcde125ce0949419960cf9adaec2284564d013fa38c2ffd88f148a73470c7"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.183504 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc8fcde125ce0949419960cf9adaec2284564d013fa38c2ffd88f148a73470c7" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.183556 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jp9g7" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.189717 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kwslq" event={"ID":"9503b7d8-c02c-4d31-9711-271cd2be4778","Type":"ContainerDied","Data":"32f231df29f6119186704088f3872a1097582cd3efe3ddb5c8b1d7de45c346e0"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.189785 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32f231df29f6119186704088f3872a1097582cd3efe3ddb5c8b1d7de45c346e0" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.189818 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kwslq" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.194876 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7038-account-create-update-9c4gc" event={"ID":"42b97a81-a094-4cb5-ba97-33eb354c1d97","Type":"ContainerDied","Data":"e6f6138967e60f80b99711a938ba5addcad267f03e983c7c6e618f5ed594eb4d"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.194898 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7038-account-create-update-9c4gc" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.194909 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6f6138967e60f80b99711a938ba5addcad267f03e983c7c6e618f5ed594eb4d" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.196269 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b060-account-create-update-5gxch" event={"ID":"45c297c7-2162-4bd6-bd83-db8bbc61d008","Type":"ContainerDied","Data":"25677d08d455fb8f95d4b1e1b92a46f867b23e2bd9f9275d4b8a46ce4b953cae"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.196298 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25677d08d455fb8f95d4b1e1b92a46f867b23e2bd9f9275d4b8a46ce4b953cae" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.196368 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b060-account-create-update-5gxch" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.198015 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zj9wf" event={"ID":"931f540a-ffd9-4d4a-b001-f68408fa02fb","Type":"ContainerDied","Data":"3c1b721f2795fbf054ab80197b2f66893776f3d4c7161b632d2022042209b4ea"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.198040 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c1b721f2795fbf054ab80197b2f66893776f3d4c7161b632d2022042209b4ea" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.198198 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zj9wf" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.199376 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wsz8z" event={"ID":"3f81f694-f0b5-4f31-a090-748418d6fd08","Type":"ContainerStarted","Data":"1136e7f4d855412cb03c024ce9c2d6377c100158b937f00a07dd20c261f552c1"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.201142 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d571-account-create-update-ghkph" event={"ID":"a0b7b710-7048-42c1-8215-c242f34da40f","Type":"ContainerDied","Data":"21714c3cd544611f44cb056931ac3088c0d1784a6f870cd10c8bf654041c650d"} Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.201170 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21714c3cd544611f44cb056931ac3088c0d1784a6f870cd10c8bf654041c650d" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.201248 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d571-account-create-update-ghkph" Mar 20 16:22:03 crc kubenswrapper[4675]: I0320 16:22:03.233426 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wsz8z" podStartSLOduration=1.967681699 podStartE2EDuration="7.233403873s" podCreationTimestamp="2026-03-20 16:21:56 +0000 UTC" firstStartedPulling="2026-03-20 16:21:57.194323871 +0000 UTC m=+1237.227953408" lastFinishedPulling="2026-03-20 16:22:02.460046045 +0000 UTC m=+1242.493675582" observedRunningTime="2026-03-20 16:22:03.218528149 +0000 UTC m=+1243.252157696" watchObservedRunningTime="2026-03-20 16:22:03.233403873 +0000 UTC m=+1243.267033420" Mar 20 16:22:04 crc kubenswrapper[4675]: I0320 16:22:04.218644 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-zt586" event={"ID":"04d5bdd9-188b-429c-a240-41424a96b5e4","Type":"ContainerStarted","Data":"f7ec49d56f30163de0e7cac0297ccb3050fb22c3498b7930d7e7d60846e5a65e"} Mar 20 16:22:04 crc kubenswrapper[4675]: I0320 16:22:04.234490 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567062-zt586" podStartSLOduration=3.26528288 podStartE2EDuration="4.234471207s" podCreationTimestamp="2026-03-20 16:22:00 +0000 UTC" firstStartedPulling="2026-03-20 16:22:02.870887506 +0000 UTC m=+1242.904517043" lastFinishedPulling="2026-03-20 16:22:03.840075823 +0000 UTC m=+1243.873705370" observedRunningTime="2026-03-20 16:22:04.233429928 +0000 UTC m=+1244.267059485" watchObservedRunningTime="2026-03-20 16:22:04.234471207 +0000 UTC m=+1244.268100744" Mar 20 16:22:05 crc kubenswrapper[4675]: I0320 16:22:05.230135 4675 generic.go:334] "Generic (PLEG): container finished" podID="04d5bdd9-188b-429c-a240-41424a96b5e4" containerID="f7ec49d56f30163de0e7cac0297ccb3050fb22c3498b7930d7e7d60846e5a65e" exitCode=0 Mar 20 16:22:05 crc kubenswrapper[4675]: I0320 16:22:05.230251 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-zt586" event={"ID":"04d5bdd9-188b-429c-a240-41424a96b5e4","Type":"ContainerDied","Data":"f7ec49d56f30163de0e7cac0297ccb3050fb22c3498b7930d7e7d60846e5a65e"} Mar 20 16:22:06 crc kubenswrapper[4675]: I0320 16:22:06.239981 4675 generic.go:334] "Generic (PLEG): container finished" podID="3f81f694-f0b5-4f31-a090-748418d6fd08" containerID="1136e7f4d855412cb03c024ce9c2d6377c100158b937f00a07dd20c261f552c1" exitCode=0 Mar 20 16:22:06 crc kubenswrapper[4675]: I0320 16:22:06.240059 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wsz8z" event={"ID":"3f81f694-f0b5-4f31-a090-748418d6fd08","Type":"ContainerDied","Data":"1136e7f4d855412cb03c024ce9c2d6377c100158b937f00a07dd20c261f552c1"} Mar 20 16:22:06 crc kubenswrapper[4675]: I0320 16:22:06.550907 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-zt586" Mar 20 16:22:06 crc kubenswrapper[4675]: I0320 16:22:06.653039 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5r4t\" (UniqueName: \"kubernetes.io/projected/04d5bdd9-188b-429c-a240-41424a96b5e4-kube-api-access-n5r4t\") pod \"04d5bdd9-188b-429c-a240-41424a96b5e4\" (UID: \"04d5bdd9-188b-429c-a240-41424a96b5e4\") " Mar 20 16:22:06 crc kubenswrapper[4675]: I0320 16:22:06.659525 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d5bdd9-188b-429c-a240-41424a96b5e4-kube-api-access-n5r4t" (OuterVolumeSpecName: "kube-api-access-n5r4t") pod "04d5bdd9-188b-429c-a240-41424a96b5e4" (UID: "04d5bdd9-188b-429c-a240-41424a96b5e4"). InnerVolumeSpecName "kube-api-access-n5r4t". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:06 crc kubenswrapper[4675]: I0320 16:22:06.754279 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5r4t\" (UniqueName: \"kubernetes.io/projected/04d5bdd9-188b-429c-a240-41424a96b5e4-kube-api-access-n5r4t\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.250422 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-zt586" event={"ID":"04d5bdd9-188b-429c-a240-41424a96b5e4","Type":"ContainerDied","Data":"8f96e54de1b60ca63166ee8fb62133a1b5552d7e1109ce93bf9d12707e96b3e4"}
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.250796 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f96e54de1b60ca63166ee8fb62133a1b5552d7e1109ce93bf9d12707e96b3e4"
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.250433 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-zt586"
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.320574 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-7z8lw"]
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.329273 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-7z8lw"]
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.611166 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.667635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-combined-ca-bundle\") pod \"3f81f694-f0b5-4f31-a090-748418d6fd08\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") "
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.667706 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4mcr\" (UniqueName: \"kubernetes.io/projected/3f81f694-f0b5-4f31-a090-748418d6fd08-kube-api-access-s4mcr\") pod \"3f81f694-f0b5-4f31-a090-748418d6fd08\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") "
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.667815 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-config-data\") pod \"3f81f694-f0b5-4f31-a090-748418d6fd08\" (UID: \"3f81f694-f0b5-4f31-a090-748418d6fd08\") "
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.674584 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f81f694-f0b5-4f31-a090-748418d6fd08-kube-api-access-s4mcr" (OuterVolumeSpecName: "kube-api-access-s4mcr") pod "3f81f694-f0b5-4f31-a090-748418d6fd08" (UID: "3f81f694-f0b5-4f31-a090-748418d6fd08"). InnerVolumeSpecName "kube-api-access-s4mcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.695113 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f81f694-f0b5-4f31-a090-748418d6fd08" (UID: "3f81f694-f0b5-4f31-a090-748418d6fd08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.715399 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-config-data" (OuterVolumeSpecName: "config-data") pod "3f81f694-f0b5-4f31-a090-748418d6fd08" (UID: "3f81f694-f0b5-4f31-a090-748418d6fd08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.769850 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.769942 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4mcr\" (UniqueName: \"kubernetes.io/projected/3f81f694-f0b5-4f31-a090-748418d6fd08-kube-api-access-s4mcr\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:07 crc kubenswrapper[4675]: I0320 16:22:07.769955 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f81f694-f0b5-4f31-a090-748418d6fd08-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.267736 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wsz8z" event={"ID":"3f81f694-f0b5-4f31-a090-748418d6fd08","Type":"ContainerDied","Data":"635254e0a4f9cf2d1b2fcd72885ddb5621dc81af54c2c5ce4849440f0188193f"}
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.267816 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635254e0a4f9cf2d1b2fcd72885ddb5621dc81af54c2c5ce4849440f0188193f"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.267914 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wsz8z"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.601470 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ncgfd"]
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.601921 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9503b7d8-c02c-4d31-9711-271cd2be4778" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.601946 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9503b7d8-c02c-4d31-9711-271cd2be4778" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.601972 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f81f694-f0b5-4f31-a090-748418d6fd08" containerName="keystone-db-sync"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.601983 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f81f694-f0b5-4f31-a090-748418d6fd08" containerName="keystone-db-sync"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.602001 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b97a81-a094-4cb5-ba97-33eb354c1d97" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602011 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b97a81-a094-4cb5-ba97-33eb354c1d97" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.602021 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5bdd9-188b-429c-a240-41424a96b5e4" containerName="oc"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602030 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5bdd9-188b-429c-a240-41424a96b5e4" containerName="oc"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.602044 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931f540a-ffd9-4d4a-b001-f68408fa02fb" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602053 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="931f540a-ffd9-4d4a-b001-f68408fa02fb" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.602071 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b7b710-7048-42c1-8215-c242f34da40f" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602079 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b7b710-7048-42c1-8215-c242f34da40f" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.602091 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c297c7-2162-4bd6-bd83-db8bbc61d008" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602099 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c297c7-2162-4bd6-bd83-db8bbc61d008" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: E0320 16:22:08.602116 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6fe44b-0699-4235-af89-d546820b782a" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602156 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6fe44b-0699-4235-af89-d546820b782a" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602372 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c297c7-2162-4bd6-bd83-db8bbc61d008" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602391 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="931f540a-ffd9-4d4a-b001-f68408fa02fb" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602405 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b97a81-a094-4cb5-ba97-33eb354c1d97" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602417 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d5bdd9-188b-429c-a240-41424a96b5e4" containerName="oc"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602431 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9503b7d8-c02c-4d31-9711-271cd2be4778" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602442 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f81f694-f0b5-4f31-a090-748418d6fd08" containerName="keystone-db-sync"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602455 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6fe44b-0699-4235-af89-d546820b782a" containerName="mariadb-database-create"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.602469 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b7b710-7048-42c1-8215-c242f34da40f" containerName="mariadb-account-create-update"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.603168 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.605945 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.606585 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.607534 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk87c"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.608110 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.610624 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.629715 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ncgfd"]
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.648360 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"]
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.658035 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.685883 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-credential-keys\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.686228 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-config-data\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.686360 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-scripts\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.686514 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-combined-ca-bundle\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.686656 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsl9t\" (UniqueName: \"kubernetes.io/projected/1bec8a0d-764c-4502-a128-6e03564486f7-kube-api-access-gsl9t\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.686756 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-fernet-keys\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.706273 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6" path="/var/lib/kubelet/pods/3e86ec3a-53d1-4c1d-825b-ea7ec54c04a6/volumes"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.707832 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"]
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789561 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789611 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-combined-ca-bundle\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789643 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsl9t\" (UniqueName: \"kubernetes.io/projected/1bec8a0d-764c-4502-a128-6e03564486f7-kube-api-access-gsl9t\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789663 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-fernet-keys\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789682 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789710 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftz6\" (UniqueName: \"kubernetes.io/projected/788934a5-2d83-4bbd-a3fa-3deec63c322c-kube-api-access-4ftz6\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789744 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-config\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789804 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789841 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-credential-keys\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789859 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-config-data\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789885 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-scripts\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.789910 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.808119 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-fernet-keys\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.818308 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-scripts\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.818636 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-credential-keys\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.818776 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-config-data\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.819259 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-combined-ca-bundle\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.823506 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsl9t\" (UniqueName: \"kubernetes.io/projected/1bec8a0d-764c-4502-a128-6e03564486f7-kube-api-access-gsl9t\") pod \"keystone-bootstrap-ncgfd\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.831566 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bf9f9b95c-kgcr5"]
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.833028 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.843422 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8v7vv"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.843690 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.844305 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.844433 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.860352 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bf9f9b95c-kgcr5"]
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.893735 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.893808 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d94a2d76-92e2-4403-ad6d-e2124b400d78-logs\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.893880 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.893919 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ftz6\" (UniqueName: \"kubernetes.io/projected/788934a5-2d83-4bbd-a3fa-3deec63c322c-kube-api-access-4ftz6\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.893948 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d94a2d76-92e2-4403-ad6d-e2124b400d78-horizon-secret-key\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.893986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-config\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.894007 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-scripts\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.894040 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.894085 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-config-data\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.894114 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhrgg\" (UniqueName: \"kubernetes.io/projected/d94a2d76-92e2-4403-ad6d-e2124b400d78-kube-api-access-xhrgg\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.894162 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.895385 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.895487 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.896125 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.896380 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-config\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.896885 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.925229 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ncgfd"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.996742 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-config-data\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.996833 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhrgg\" (UniqueName: \"kubernetes.io/projected/d94a2d76-92e2-4403-ad6d-e2124b400d78-kube-api-access-xhrgg\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.996922 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d94a2d76-92e2-4403-ad6d-e2124b400d78-logs\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.996986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d94a2d76-92e2-4403-ad6d-e2124b400d78-horizon-secret-key\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.997026 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-scripts\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.997903 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-scripts\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:08 crc kubenswrapper[4675]: I0320 16:22:08.998415 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d94a2d76-92e2-4403-ad6d-e2124b400d78-logs\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:08.999148 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-config-data\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.003044 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ddfbw"]
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.012542 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.005119 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d94a2d76-92e2-4403-ad6d-e2124b400d78-horizon-secret-key\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.033740 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ftz6\" (UniqueName: \"kubernetes.io/projected/788934a5-2d83-4bbd-a3fa-3deec63c322c-kube-api-access-4ftz6\") pod \"dnsmasq-dns-5c5cc7c5ff-c89l4\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.061320 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.061643 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pch7c"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.061824 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.094695 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.097250 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.099653 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-combined-ca-bundle\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.099706 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-config\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.099806 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnb27\" (UniqueName: \"kubernetes.io/projected/7935d7aa-cb6b-4b66-a58f-31e0cce41114-kube-api-access-xnb27\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.101329 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhrgg\" (UniqueName: \"kubernetes.io/projected/d94a2d76-92e2-4403-ad6d-e2124b400d78-kube-api-access-xhrgg\") pod \"horizon-5bf9f9b95c-kgcr5\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " pod="openstack/horizon-5bf9f9b95c-kgcr5"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.140102 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.140806 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.184009 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tt28r"]
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.185241 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tt28r"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.191421 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xpldv"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200432 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200636 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200735 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200782 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gltg\" (UniqueName: \"kubernetes.io/projected/7497e477-249c-4346-9087-458ba9e6c152-kube-api-access-8gltg\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200838 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnb27\" (UniqueName: \"kubernetes.io/projected/7935d7aa-cb6b-4b66-a58f-31e0cce41114-kube-api-access-xnb27\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200897
4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-scripts\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200928 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-log-httpd\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200950 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200978 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-config-data\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.200998 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-combined-ca-bundle\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.201017 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-config\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.201041 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-run-httpd\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.211466 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-combined-ca-bundle\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.214978 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-config\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.231013 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ddfbw"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.237037 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.255049 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hhfc9"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.255652 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bf9f9b95c-kgcr5" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.256188 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.284955 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.278095 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.278873 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-scdtr" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.286578 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnb27\" (UniqueName: \"kubernetes.io/projected/7935d7aa-cb6b-4b66-a58f-31e0cce41114-kube-api-access-xnb27\") pod \"neutron-db-sync-ddfbw\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") " pod="openstack/neutron-db-sync-ddfbw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.311915 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-run-httpd\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312433 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312475 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8gltg\" (UniqueName: \"kubernetes.io/projected/7497e477-249c-4346-9087-458ba9e6c152-kube-api-access-8gltg\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312520 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-combined-ca-bundle\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312663 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-etc-machine-id\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312698 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-scripts\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312719 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-db-sync-config-data\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312740 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-config-data\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312776 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkck2\" (UniqueName: \"kubernetes.io/projected/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-kube-api-access-zkck2\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312823 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-log-httpd\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312868 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-db-sync-config-data\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312896 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312912 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-combined-ca-bundle\") pod 
\"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312944 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5927\" (UniqueName: \"kubernetes.io/projected/c1560aa0-d06c-4c98-80bf-0635065cac6f-kube-api-access-h5927\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.312974 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-config-data\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.313031 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-scripts\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.313519 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-run-httpd\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.313707 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-log-httpd\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.313028 4675 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tt28r"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.328868 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-scripts\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.329818 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-config-data\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.367311 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.367485 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.385390 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hhfc9"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.391838 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gltg\" (UniqueName: \"kubernetes.io/projected/7497e477-249c-4346-9087-458ba9e6c152-kube-api-access-8gltg\") pod \"ceilometer-0\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") " pod="openstack/ceilometer-0" Mar 20 
16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.415804 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-etc-machine-id\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.415879 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-db-sync-config-data\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.415902 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-config-data\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.415929 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkck2\" (UniqueName: \"kubernetes.io/projected/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-kube-api-access-zkck2\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.415981 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-db-sync-config-data\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.416005 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-combined-ca-bundle\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.416032 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5927\" (UniqueName: \"kubernetes.io/projected/c1560aa0-d06c-4c98-80bf-0635065cac6f-kube-api-access-h5927\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.416075 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-scripts\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.416117 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-combined-ca-bundle\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.417893 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-etc-machine-id\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.433611 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ddfbw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.440205 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.448316 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-combined-ca-bundle\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.458055 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-db-sync-config-data\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.458541 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-scripts\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.474187 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-db-sync-config-data\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.485405 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-config-data\") pod \"cinder-db-sync-tt28r\" (UID: 
\"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.486471 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-combined-ca-bundle\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.493371 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5927\" (UniqueName: \"kubernetes.io/projected/c1560aa0-d06c-4c98-80bf-0635065cac6f-kube-api-access-h5927\") pod \"barbican-db-sync-hhfc9\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") " pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.493447 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-546bd65fb7-ktfbp"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.494784 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.517137 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkck2\" (UniqueName: \"kubernetes.io/projected/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-kube-api-access-zkck2\") pod \"cinder-db-sync-tt28r\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.522703 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tt28r" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.576212 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-546bd65fb7-ktfbp"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.604442 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hhfc9" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.622964 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-scripts\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.623053 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b282cfa4-8448-4eef-8463-fca67d9608fd-horizon-secret-key\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.623104 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-config-data\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.623135 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b282cfa4-8448-4eef-8463-fca67d9608fd-logs\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.623153 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs5lz\" (UniqueName: \"kubernetes.io/projected/b282cfa4-8448-4eef-8463-fca67d9608fd-kube-api-access-hs5lz\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") 
" pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.651152 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7tdnv"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.652537 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.659459 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5zhjr" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.659787 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.661511 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.705235 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.706586 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.711117 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.711506 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-859cw" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.711873 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.712422 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.715542 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7tdnv"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729263 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-scripts\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729348 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-config-data\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729376 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b282cfa4-8448-4eef-8463-fca67d9608fd-horizon-secret-key\") pod \"horizon-546bd65fb7-ktfbp\" (UID: 
\"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729431 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-combined-ca-bundle\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729459 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-scripts\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729501 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-config-data\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729525 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653f25dd-b7f2-4ec1-8569-96af48c4c388-logs\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729561 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b282cfa4-8448-4eef-8463-fca67d9608fd-logs\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 
crc kubenswrapper[4675]: I0320 16:22:09.729584 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5lz\" (UniqueName: \"kubernetes.io/projected/b282cfa4-8448-4eef-8463-fca67d9608fd-kube-api-access-hs5lz\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.729638 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hz6\" (UniqueName: \"kubernetes.io/projected/653f25dd-b7f2-4ec1-8569-96af48c4c388-kube-api-access-l6hz6\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.730439 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-scripts\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.731601 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b282cfa4-8448-4eef-8463-fca67d9608fd-logs\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.731997 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-config-data\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.737882 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.742736 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.755194 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b282cfa4-8448-4eef-8463-fca67d9608fd-horizon-secret-key\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.762998 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-s745j"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.764417 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.770397 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs5lz\" (UniqueName: \"kubernetes.io/projected/b282cfa4-8448-4eef-8463-fca67d9608fd-kube-api-access-hs5lz\") pod \"horizon-546bd65fb7-ktfbp\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.791089 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-s745j"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.808275 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.810414 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.816235 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.816436 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.817065 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.830733 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.831685 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-scripts\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.834302 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.834525 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-logs\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.834645 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-config-data\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.834795 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-combined-ca-bundle\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.834896 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-scripts\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.835073 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653f25dd-b7f2-4ec1-8569-96af48c4c388-logs\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.835227 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2l4r\" (UniqueName: \"kubernetes.io/projected/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-kube-api-access-v2l4r\") pod 
\"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.835328 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.835428 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hz6\" (UniqueName: \"kubernetes.io/projected/653f25dd-b7f2-4ec1-8569-96af48c4c388-kube-api-access-l6hz6\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.835522 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.835622 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-config-data\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.832637 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.838150 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653f25dd-b7f2-4ec1-8569-96af48c4c388-logs\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.845462 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-scripts\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.846269 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-combined-ca-bundle\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.848408 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-config-data\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.891341 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hz6\" (UniqueName: \"kubernetes.io/projected/653f25dd-b7f2-4ec1-8569-96af48c4c388-kube-api-access-l6hz6\") pod \"placement-db-sync-7tdnv\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.941740 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.941812 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.941887 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kb8\" (UniqueName: \"kubernetes.io/projected/a85fb28f-af43-4d9b-acf3-71c4f4494daa-kube-api-access-59kb8\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.941920 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.941963 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942002 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2l4r\" (UniqueName: \"kubernetes.io/projected/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-kube-api-access-v2l4r\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942047 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942099 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942131 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942182 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942206 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-config-data\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942229 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942287 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942308 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942354 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942381 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-config\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942472 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-scripts\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942517 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.942547 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.943063 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-logs\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.943115 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8nl\" (UniqueName: \"kubernetes.io/projected/a29ce001-6800-43c1-9b4d-24be729f85b8-kube-api-access-jc8nl\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.944130 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.947077 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.953858 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-config-data\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.962874 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-logs\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.970985 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2l4r\" (UniqueName: \"kubernetes.io/projected/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-kube-api-access-v2l4r\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.973421 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-scripts\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.988607 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.990373 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:09 crc kubenswrapper[4675]: I0320 16:22:09.994010 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-external-api-0\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.027716 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044647 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044695 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044730 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kb8\" (UniqueName: \"kubernetes.io/projected/a85fb28f-af43-4d9b-acf3-71c4f4494daa-kube-api-access-59kb8\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044751 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044780 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044812 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044841 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044858 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044885 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044908 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044944 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044969 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-config\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.044986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.045018 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8nl\" (UniqueName: \"kubernetes.io/projected/a29ce001-6800-43c1-9b4d-24be729f85b8-kube-api-access-jc8nl\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.047140 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.047449 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-config\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.047998 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.048515 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.048725 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.049692 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " 
pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.056548 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.057389 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.061370 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.062901 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.064017 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.064230 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.069901 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.073288 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ncgfd"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.074006 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kb8\" (UniqueName: \"kubernetes.io/projected/a85fb28f-af43-4d9b-acf3-71c4f4494daa-kube-api-access-59kb8\") pod \"dnsmasq-dns-8b5c85b87-s745j\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " pod="openstack/dnsmasq-dns-8b5c85b87-s745j"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.077430 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.084350 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8nl\" (UniqueName: \"kubernetes.io/projected/a29ce001-6800-43c1-9b4d-24be729f85b8-kube-api-access-jc8nl\") pod \"glance-default-internal-api-0\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.084839 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-s745j"
Mar 20 16:22:10 crc kubenswrapper[4675]: W0320 16:22:10.093354 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bec8a0d_764c_4502_a128_6e03564486f7.slice/crio-9a18c5944a225a3c90194afb5077fcdfe1514ac7b87ff4c5112291677f3e8fd1 WatchSource:0}: Error finding container 9a18c5944a225a3c90194afb5077fcdfe1514ac7b87ff4c5112291677f3e8fd1: Status 404 returned error can't find the container with id 9a18c5944a225a3c90194afb5077fcdfe1514ac7b87ff4c5112291677f3e8fd1
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.240535 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bf9f9b95c-kgcr5"]
Mar 20 16:22:10 crc kubenswrapper[4675]: W0320 16:22:10.258741 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd94a2d76_92e2_4403_ad6d_e2124b400d78.slice/crio-5ca707fa6b221436887a3e61b4dd7600c7645d1480ab8b43494fabbc82e9f413 WatchSource:0}: Error finding container 5ca707fa6b221436887a3e61b4dd7600c7645d1480ab8b43494fabbc82e9f413: Status 404 returned error can't find the container with id 5ca707fa6b221436887a3e61b4dd7600c7645d1480ab8b43494fabbc82e9f413
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.337327 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.418479 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hhfc9"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.443683 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf9f9b95c-kgcr5" event={"ID":"d94a2d76-92e2-4403-ad6d-e2124b400d78","Type":"ContainerStarted","Data":"5ca707fa6b221436887a3e61b4dd7600c7645d1480ab8b43494fabbc82e9f413"}
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.449378 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncgfd" event={"ID":"1bec8a0d-764c-4502-a128-6e03564486f7","Type":"ContainerStarted","Data":"9a18c5944a225a3c90194afb5077fcdfe1514ac7b87ff4c5112291677f3e8fd1"}
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.454785 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.488033 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.619734 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ddfbw"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.659860 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tt28r"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.799385 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7tdnv"]
Mar 20 16:22:10 crc kubenswrapper[4675]: I0320 16:22:10.813934 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-546bd65fb7-ktfbp"]
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.008902 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-s745j"]
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.101750 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:22:11 crc kubenswrapper[4675]: W0320 16:22:11.110507 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccaf475c_1819_4acd_ac59_5b4e5bdd1212.slice/crio-3a3ab818c3124b09a9aa9927ddc9750761348965b4190bfb8bd271b50f3927c0 WatchSource:0}: Error finding container 3a3ab818c3124b09a9aa9927ddc9750761348965b4190bfb8bd271b50f3927c0: Status 404 returned error can't find the container with id 3a3ab818c3124b09a9aa9927ddc9750761348965b4190bfb8bd271b50f3927c0
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.243838 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:22:11 crc kubenswrapper[4675]: W0320 16:22:11.274395 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29ce001_6800_43c1_9b4d_24be729f85b8.slice/crio-8b7daba08dee46e7f842fc0519b606984dc9270f1be9383a1497f3e5b863290a WatchSource:0}: Error finding container 8b7daba08dee46e7f842fc0519b606984dc9270f1be9383a1497f3e5b863290a: Status 404 returned error can't find the container with id 8b7daba08dee46e7f842fc0519b606984dc9270f1be9383a1497f3e5b863290a
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.556821 4675 generic.go:334] "Generic (PLEG): container finished" podID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerID="577144b8f6a3076b1c84be34d3bb4ee47f990061b33ae0c33245b8c83f1006ed" exitCode=0
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.557144 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" event={"ID":"a85fb28f-af43-4d9b-acf3-71c4f4494daa","Type":"ContainerDied","Data":"577144b8f6a3076b1c84be34d3bb4ee47f990061b33ae0c33245b8c83f1006ed"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.557175 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" event={"ID":"a85fb28f-af43-4d9b-acf3-71c4f4494daa","Type":"ContainerStarted","Data":"93ec57987c99e373c7454ed8d20b47fd7e13c16cf3013a096f1e6b5f6d86e1ed"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.580065 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ddfbw" event={"ID":"7935d7aa-cb6b-4b66-a58f-31e0cce41114","Type":"ContainerStarted","Data":"9e0089b35c709f2f9ddef36339c3d4968101c7fdf03c76ca09b8329ab9d182f7"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.580117 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ddfbw" event={"ID":"7935d7aa-cb6b-4b66-a58f-31e0cce41114","Type":"ContainerStarted","Data":"55e540a16359bd80376f74367b953322dbcc7c78934a5bdeed6564b4afb4ef9a"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.642305 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7tdnv" event={"ID":"653f25dd-b7f2-4ec1-8569-96af48c4c388","Type":"ContainerStarted","Data":"01ae6f1499eed2aec66a74aa556c358bf859fab0ad3474ce6deb717687a6575b"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.674451 4675 generic.go:334] "Generic (PLEG): container finished" podID="788934a5-2d83-4bbd-a3fa-3deec63c322c" containerID="60c87cbe6275383b7b585e691b1bdb92258b5c0fa6cd05cad0de176a9b3c1558" exitCode=0
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.674521 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4" event={"ID":"788934a5-2d83-4bbd-a3fa-3deec63c322c","Type":"ContainerDied","Data":"60c87cbe6275383b7b585e691b1bdb92258b5c0fa6cd05cad0de176a9b3c1558"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.674547 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4" event={"ID":"788934a5-2d83-4bbd-a3fa-3deec63c322c","Type":"ContainerStarted","Data":"93ef6b859de455da47f2aaa4d4ef0663f719f7592308c2a4061f710f9725267c"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.700901 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.714125 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerStarted","Data":"eda9251422a7a196a30fbe6be7185712c179311a552f52b6c1a0f12fdbe7ea58"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.731035 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ddfbw" podStartSLOduration=3.7310078730000003 podStartE2EDuration="3.731007873s" podCreationTimestamp="2026-03-20 16:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:11.674296035 +0000 UTC m=+1251.707925572" watchObservedRunningTime="2026-03-20 16:22:11.731007873 +0000 UTC m=+1251.764637410"
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.832508 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bf9f9b95c-kgcr5"]
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.834515 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncgfd" event={"ID":"1bec8a0d-764c-4502-a128-6e03564486f7","Type":"ContainerStarted","Data":"fd6a66917b7f0e5cf198aacb7bfd72ffce1fefd8c9299f3058986639f90435b2"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.850365 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a29ce001-6800-43c1-9b4d-24be729f85b8","Type":"ContainerStarted","Data":"8b7daba08dee46e7f842fc0519b606984dc9270f1be9383a1497f3e5b863290a"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.908383 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt28r" event={"ID":"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43","Type":"ContainerStarted","Data":"7afe2c666ee5b6c44f7514ae19b7f082130de6a94648181d09da106366ddbf16"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.930647 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b9b9d8b55-gbspf"]
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.930807 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ncgfd" podStartSLOduration=3.930790632 podStartE2EDuration="3.930790632s" podCreationTimestamp="2026-03-20 16:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:11.891786847 +0000 UTC m=+1251.925416384" watchObservedRunningTime="2026-03-20 16:22:11.930790632 +0000 UTC m=+1251.964420159"
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.932246 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.950017 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-546bd65fb7-ktfbp" event={"ID":"b282cfa4-8448-4eef-8463-fca67d9608fd","Type":"ContainerStarted","Data":"87cbee3a0af82bfda61fe02012097ad1957713acb71b9c41eb26f72620b31922"}
Mar 20 16:22:11 crc kubenswrapper[4675]: I0320 16:22:11.995149 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:11.999915 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccaf475c-1819-4acd-ac59-5b4e5bdd1212","Type":"ContainerStarted","Data":"3a3ab818c3124b09a9aa9927ddc9750761348965b4190bfb8bd271b50f3927c0"}
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.007242 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hhfc9" event={"ID":"c1560aa0-d06c-4c98-80bf-0635065cac6f","Type":"ContainerStarted","Data":"9ae74317edd09801769caa5ebd25dafd2ffdb7b049cc42a62d9ad7d4280276ee"}
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.040741 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b9b9d8b55-gbspf"]
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.128155 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.135604 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-scripts\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.135667 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl9m\" (UniqueName: \"kubernetes.io/projected/85c227a3-b831-440b-ab1a-4171217faf81-kube-api-access-6pl9m\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.135717 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85c227a3-b831-440b-ab1a-4171217faf81-horizon-secret-key\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.135741 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-config-data\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.135793 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c227a3-b831-440b-ab1a-4171217faf81-logs\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.238504 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-scripts\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.238590 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pl9m\" (UniqueName: \"kubernetes.io/projected/85c227a3-b831-440b-ab1a-4171217faf81-kube-api-access-6pl9m\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.238677 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85c227a3-b831-440b-ab1a-4171217faf81-horizon-secret-key\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.238701 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-config-data\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.238756 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c227a3-b831-440b-ab1a-4171217faf81-logs\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.239481 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c227a3-b831-440b-ab1a-4171217faf81-logs\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.241833 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-scripts\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.243908 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-config-data\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.281915 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl9m\" (UniqueName: \"kubernetes.io/projected/85c227a3-b831-440b-ab1a-4171217faf81-kube-api-access-6pl9m\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.284003 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85c227a3-b831-440b-ab1a-4171217faf81-horizon-secret-key\") pod \"horizon-7b9b9d8b55-gbspf\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.475167 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.573031 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b9b9d8b55-gbspf"
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.653470 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-config\") pod \"788934a5-2d83-4bbd-a3fa-3deec63c322c\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") "
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.653841 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-swift-storage-0\") pod \"788934a5-2d83-4bbd-a3fa-3deec63c322c\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") "
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.653961 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-svc\") pod \"788934a5-2d83-4bbd-a3fa-3deec63c322c\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") "
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.653993 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-nb\") pod \"788934a5-2d83-4bbd-a3fa-3deec63c322c\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") "
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.654031 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-sb\") pod \"788934a5-2d83-4bbd-a3fa-3deec63c322c\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") "
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.654062 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ftz6\" (UniqueName: \"kubernetes.io/projected/788934a5-2d83-4bbd-a3fa-3deec63c322c-kube-api-access-4ftz6\") pod \"788934a5-2d83-4bbd-a3fa-3deec63c322c\" (UID: \"788934a5-2d83-4bbd-a3fa-3deec63c322c\") "
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.661471 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788934a5-2d83-4bbd-a3fa-3deec63c322c-kube-api-access-4ftz6" (OuterVolumeSpecName: "kube-api-access-4ftz6") pod "788934a5-2d83-4bbd-a3fa-3deec63c322c" (UID: "788934a5-2d83-4bbd-a3fa-3deec63c322c"). InnerVolumeSpecName "kube-api-access-4ftz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.736888 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-config" (OuterVolumeSpecName: "config") pod "788934a5-2d83-4bbd-a3fa-3deec63c322c" (UID: "788934a5-2d83-4bbd-a3fa-3deec63c322c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.743646 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "788934a5-2d83-4bbd-a3fa-3deec63c322c" (UID: "788934a5-2d83-4bbd-a3fa-3deec63c322c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.751629 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "788934a5-2d83-4bbd-a3fa-3deec63c322c" (UID: "788934a5-2d83-4bbd-a3fa-3deec63c322c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.755980 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "788934a5-2d83-4bbd-a3fa-3deec63c322c" (UID: "788934a5-2d83-4bbd-a3fa-3deec63c322c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.756917 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "788934a5-2d83-4bbd-a3fa-3deec63c322c" (UID: "788934a5-2d83-4bbd-a3fa-3deec63c322c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.762055 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.762276 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.762290 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.762300 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.762312 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ftz6\" (UniqueName: \"kubernetes.io/projected/788934a5-2d83-4bbd-a3fa-3deec63c322c-kube-api-access-4ftz6\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:12 crc kubenswrapper[4675]: I0320 16:22:12.762325 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788934a5-2d83-4bbd-a3fa-3deec63c322c-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.053372 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccaf475c-1819-4acd-ac59-5b4e5bdd1212","Type":"ContainerStarted","Data":"97c04a750576bd75f5d8b5a39dadbe349a69e3f6a807976a4c06126be0395496"}
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.068602 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" event={"ID":"a85fb28f-af43-4d9b-acf3-71c4f4494daa","Type":"ContainerStarted","Data":"f61f12a86cdebbbb94dfda6cf319ec5fb54b82b18f966819d172e491c5e172f5"}
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.071414 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-s745j"
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.103607 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a29ce001-6800-43c1-9b4d-24be729f85b8","Type":"ContainerStarted","Data":"fb69cf076750f7c9bd61dd2d02d329ffd48ccc53bc41a06aeacdecbc0849be38"}
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.105239 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" podStartSLOduration=4.105221049 podStartE2EDuration="4.105221049s" podCreationTimestamp="2026-03-20 16:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:13.101380592 +0000 UTC m=+1253.135010129" watchObservedRunningTime="2026-03-20 16:22:13.105221049 +0000 UTC m=+1253.138850586"
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.126497 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.141580 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-c89l4" event={"ID":"788934a5-2d83-4bbd-a3fa-3deec63c322c","Type":"ContainerDied","Data":"93ef6b859de455da47f2aaa4d4ef0663f719f7592308c2a4061f710f9725267c"}
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.147834 4675 scope.go:117] "RemoveContainer" containerID="60c87cbe6275383b7b585e691b1bdb92258b5c0fa6cd05cad0de176a9b3c1558"
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.261354 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"]
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.289429 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-c89l4"]
Mar 20 16:22:13 crc kubenswrapper[4675]: I0320 16:22:13.301396 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b9b9d8b55-gbspf"]
Mar 20 16:22:14 crc kubenswrapper[4675]: I0320 16:22:14.137932 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccaf475c-1819-4acd-ac59-5b4e5bdd1212","Type":"ContainerStarted","Data":"470030f1534ccb6be36e7a43d679e8315e0eb019c32dae12e36c562788557f78"}
Mar 20 16:22:14 crc kubenswrapper[4675]: I0320 16:22:14.138390 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-log" containerID="cri-o://97c04a750576bd75f5d8b5a39dadbe349a69e3f6a807976a4c06126be0395496" gracePeriod=30
Mar 20 16:22:14 crc kubenswrapper[4675]: I0320 16:22:14.138951 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-httpd" containerID="cri-o://470030f1534ccb6be36e7a43d679e8315e0eb019c32dae12e36c562788557f78" gracePeriod=30
Mar 20 16:22:14 crc kubenswrapper[4675]: I0320 16:22:14.141036 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9b9d8b55-gbspf" event={"ID":"85c227a3-b831-440b-ab1a-4171217faf81","Type":"ContainerStarted","Data":"e5a7403e6a3dec7b0efd0423e6523937c2605b8ecc7f623efe3a05e7d4548aea"}
Mar 20 16:22:14 crc kubenswrapper[4675]: I0320 16:22:14.163736 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.163710351 podStartE2EDuration="5.163710351s" podCreationTimestamp="2026-03-20 16:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:14.161465308 +0000 UTC m=+1254.195094855" watchObservedRunningTime="2026-03-20 16:22:14.163710351 +0000 UTC m=+1254.197339888"
Mar 20 16:22:14 crc kubenswrapper[4675]: I0320 16:22:14.689229 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788934a5-2d83-4bbd-a3fa-3deec63c322c" path="/var/lib/kubelet/pods/788934a5-2d83-4bbd-a3fa-3deec63c322c/volumes"
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.175223 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a29ce001-6800-43c1-9b4d-24be729f85b8","Type":"ContainerStarted","Data":"9f37a56a3bcc72e943f46014e0b4fdf1e086b73ac973167b466517aac66a3907"}
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.175298 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-log" containerID="cri-o://fb69cf076750f7c9bd61dd2d02d329ffd48ccc53bc41a06aeacdecbc0849be38" gracePeriod=30
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.175377 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-httpd" containerID="cri-o://9f37a56a3bcc72e943f46014e0b4fdf1e086b73ac973167b466517aac66a3907" gracePeriod=30
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.183956 4675 generic.go:334] "Generic (PLEG): container finished" podID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerID="470030f1534ccb6be36e7a43d679e8315e0eb019c32dae12e36c562788557f78" exitCode=143
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.183989 4675 generic.go:334] "Generic (PLEG): container finished" podID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerID="97c04a750576bd75f5d8b5a39dadbe349a69e3f6a807976a4c06126be0395496" exitCode=143
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.184147 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccaf475c-1819-4acd-ac59-5b4e5bdd1212","Type":"ContainerDied","Data":"470030f1534ccb6be36e7a43d679e8315e0eb019c32dae12e36c562788557f78"}
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.184242 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccaf475c-1819-4acd-ac59-5b4e5bdd1212","Type":"ContainerDied","Data":"97c04a750576bd75f5d8b5a39dadbe349a69e3f6a807976a4c06126be0395496"}
Mar 20 16:22:15 crc kubenswrapper[4675]: I0320 16:22:15.218144 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.216603107 podStartE2EDuration="6.216603107s" podCreationTimestamp="2026-03-20 16:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:15.19336027 +0000 UTC m=+1255.226989817" watchObservedRunningTime="2026-03-20 16:22:15.216603107 +0000 UTC m=+1255.250232644"
Mar 20 16:22:16 crc kubenswrapper[4675]: I0320 16:22:16.198421 4675 generic.go:334] "Generic (PLEG): container finished" podID="1bec8a0d-764c-4502-a128-6e03564486f7" containerID="fd6a66917b7f0e5cf198aacb7bfd72ffce1fefd8c9299f3058986639f90435b2" exitCode=0
Mar 20 16:22:16 crc kubenswrapper[4675]: I0320 16:22:16.198512 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncgfd" event={"ID":"1bec8a0d-764c-4502-a128-6e03564486f7","Type":"ContainerDied","Data":"fd6a66917b7f0e5cf198aacb7bfd72ffce1fefd8c9299f3058986639f90435b2"}
Mar 20 16:22:16 crc kubenswrapper[4675]: I0320 16:22:16.202663 4675 generic.go:334] "Generic (PLEG): container finished" podID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerID="9f37a56a3bcc72e943f46014e0b4fdf1e086b73ac973167b466517aac66a3907" exitCode=0
Mar 20 16:22:16 crc kubenswrapper[4675]: I0320 16:22:16.202690 4675 generic.go:334] "Generic (PLEG): container finished" podID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerID="fb69cf076750f7c9bd61dd2d02d329ffd48ccc53bc41a06aeacdecbc0849be38" exitCode=143
Mar 20 16:22:16 crc kubenswrapper[4675]: I0320 16:22:16.202732 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a29ce001-6800-43c1-9b4d-24be729f85b8","Type":"ContainerDied","Data":"9f37a56a3bcc72e943f46014e0b4fdf1e086b73ac973167b466517aac66a3907"}
Mar 20 16:22:16 crc kubenswrapper[4675]: I0320 16:22:16.202760 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a29ce001-6800-43c1-9b4d-24be729f85b8","Type":"ContainerDied","Data":"fb69cf076750f7c9bd61dd2d02d329ffd48ccc53bc41a06aeacdecbc0849be38"}
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.195318 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.213983 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.214965 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccaf475c-1819-4acd-ac59-5b4e5bdd1212","Type":"ContainerDied","Data":"3a3ab818c3124b09a9aa9927ddc9750761348965b4190bfb8bd271b50f3927c0"}
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.215006 4675 scope.go:117] "RemoveContainer" containerID="470030f1534ccb6be36e7a43d679e8315e0eb019c32dae12e36c562788557f78"
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288113 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-combined-ca-bundle\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288158 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-logs\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288225 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-scripts\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288249 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-public-tls-certs\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288273 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-httpd-run\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288304 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-config-data\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288321 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2l4r\" (UniqueName: \"kubernetes.io/projected/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-kube-api-access-v2l4r\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288348 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\" (UID: \"ccaf475c-1819-4acd-ac59-5b4e5bdd1212\") "
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.288813 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-logs" (OuterVolumeSpecName: "logs") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.289115 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.294049 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-scripts" (OuterVolumeSpecName: "scripts") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.311153 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.312846 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-kube-api-access-v2l4r" (OuterVolumeSpecName: "kube-api-access-v2l4r") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "kube-api-access-v2l4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.324995 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.351302 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-config-data" (OuterVolumeSpecName: "config-data") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.384413 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ccaf475c-1819-4acd-ac59-5b4e5bdd1212" (UID: "ccaf475c-1819-4acd-ac59-5b4e5bdd1212"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395293 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395338 4675 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395352 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395363 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395375 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2l4r\" (UniqueName: \"kubernetes.io/projected/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-kube-api-access-v2l4r\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395413 4675 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395427 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.395437 4675 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccaf475c-1819-4acd-ac59-5b4e5bdd1212-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.430829 4675 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.497443 4675 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.550179 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.615401 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.625651 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:17 crc kubenswrapper[4675]: E0320 16:22:17.626077 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-log" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.626096 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-log" Mar 20 16:22:17 crc kubenswrapper[4675]: E0320 16:22:17.626111 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788934a5-2d83-4bbd-a3fa-3deec63c322c" containerName="init" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.626117 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="788934a5-2d83-4bbd-a3fa-3deec63c322c" containerName="init" Mar 20 16:22:17 crc kubenswrapper[4675]: E0320 16:22:17.626128 4675 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-httpd" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.626134 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-httpd" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.626298 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="788934a5-2d83-4bbd-a3fa-3deec63c322c" containerName="init" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.626317 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-log" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.626328 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" containerName="glance-httpd" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.627218 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.629065 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.629289 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.635246 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.806685 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc 
kubenswrapper[4675]: I0320 16:22:17.806735 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.806789 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.806845 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.806864 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqq5\" (UniqueName: \"kubernetes.io/projected/e4e0a711-d2ca-451c-8327-1a045ac918e4-kube-api-access-6mqq5\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.807040 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-logs\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" 
Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.807114 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.807202 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909544 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909610 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909656 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: 
I0320 16:22:17.909705 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909732 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqq5\" (UniqueName: \"kubernetes.io/projected/e4e0a711-d2ca-451c-8327-1a045ac918e4-kube-api-access-6mqq5\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909790 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-logs\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909829 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.909873 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.910833 4675 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.911354 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-logs\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.912865 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.917555 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.917800 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.918002 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.919614 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.933366 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqq5\" (UniqueName: \"kubernetes.io/projected/e4e0a711-d2ca-451c-8327-1a045ac918e4-kube-api-access-6mqq5\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.940669 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:17 crc kubenswrapper[4675]: I0320 16:22:17.951833 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.346298 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-546bd65fb7-ktfbp"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.378450 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c454cc68b-lmjfb"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.381167 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.386518 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.391292 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.401486 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c454cc68b-lmjfb"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.469915 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b9b9d8b55-gbspf"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.484606 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d79d4db6d-vnw9g"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.487302 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.497685 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d79d4db6d-vnw9g"] Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.519811 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-tls-certs\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.519885 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-scripts\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" 
Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.519921 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e972387-c641-42bd-9c3f-69fc70869c8a-logs\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.519937 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-combined-ca-bundle\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.519954 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-secret-key\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.519975 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-config-data\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.520046 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7rf\" (UniqueName: \"kubernetes.io/projected/3e972387-c641-42bd-9c3f-69fc70869c8a-kube-api-access-2b7rf\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" 
Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.621881 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-tls-certs\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.621941 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-combined-ca-bundle\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.621984 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81ab73e-24ce-451b-9064-d6ebea2c5976-logs\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622074 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-scripts\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622171 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e972387-c641-42bd-9c3f-69fc70869c8a-logs\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622201 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-combined-ca-bundle\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622226 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-secret-key\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622260 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k84l5\" (UniqueName: \"kubernetes.io/projected/b81ab73e-24ce-451b-9064-d6ebea2c5976-kube-api-access-k84l5\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622285 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-config-data\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622329 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-horizon-tls-certs\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622399 4675 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2b7rf\" (UniqueName: \"kubernetes.io/projected/3e972387-c641-42bd-9c3f-69fc70869c8a-kube-api-access-2b7rf\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622436 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-horizon-secret-key\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622467 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81ab73e-24ce-451b-9064-d6ebea2c5976-scripts\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622493 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b81ab73e-24ce-451b-9064-d6ebea2c5976-config-data\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.622916 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e972387-c641-42bd-9c3f-69fc70869c8a-logs\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.623376 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-scripts\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.626962 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-combined-ca-bundle\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.627253 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-tls-certs\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.629394 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-config-data\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.632229 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-secret-key\") pod \"horizon-7c454cc68b-lmjfb\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.648678 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7rf\" (UniqueName: \"kubernetes.io/projected/3e972387-c641-42bd-9c3f-69fc70869c8a-kube-api-access-2b7rf\") pod \"horizon-7c454cc68b-lmjfb\" (UID: 
\"3e972387-c641-42bd-9c3f-69fc70869c8a\") " pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.694646 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccaf475c-1819-4acd-ac59-5b4e5bdd1212" path="/var/lib/kubelet/pods/ccaf475c-1819-4acd-ac59-5b4e5bdd1212/volumes" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.704352 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.723889 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-combined-ca-bundle\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.723947 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81ab73e-24ce-451b-9064-d6ebea2c5976-logs\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.724000 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k84l5\" (UniqueName: \"kubernetes.io/projected/b81ab73e-24ce-451b-9064-d6ebea2c5976-kube-api-access-k84l5\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.724038 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-horizon-tls-certs\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " 
pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.724099 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-horizon-secret-key\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.724123 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b81ab73e-24ce-451b-9064-d6ebea2c5976-config-data\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.724137 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81ab73e-24ce-451b-9064-d6ebea2c5976-scripts\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.725148 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81ab73e-24ce-451b-9064-d6ebea2c5976-scripts\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.726526 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81ab73e-24ce-451b-9064-d6ebea2c5976-logs\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.726879 4675 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b81ab73e-24ce-451b-9064-d6ebea2c5976-config-data\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.732703 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-horizon-secret-key\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.732744 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-combined-ca-bundle\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.733448 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b81ab73e-24ce-451b-9064-d6ebea2c5976-horizon-tls-certs\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.740367 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k84l5\" (UniqueName: \"kubernetes.io/projected/b81ab73e-24ce-451b-9064-d6ebea2c5976-kube-api-access-k84l5\") pod \"horizon-5d79d4db6d-vnw9g\" (UID: \"b81ab73e-24ce-451b-9064-d6ebea2c5976\") " pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:18 crc kubenswrapper[4675]: I0320 16:22:18.821799 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:20 crc kubenswrapper[4675]: I0320 16:22:20.086941 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:20 crc kubenswrapper[4675]: I0320 16:22:20.180956 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7shgn"] Mar 20 16:22:20 crc kubenswrapper[4675]: I0320 16:22:20.181272 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns" containerID="cri-o://c72d885b2d21c9cbe9ddbeac4ec5c9e97da730c57c9f2a83cd129fe6a71f4ea0" gracePeriod=10 Mar 20 16:22:21 crc kubenswrapper[4675]: I0320 16:22:21.268527 4675 generic.go:334] "Generic (PLEG): container finished" podID="06dfefff-e695-44b8-bb74-10ef9a589184" containerID="c72d885b2d21c9cbe9ddbeac4ec5c9e97da730c57c9f2a83cd129fe6a71f4ea0" exitCode=0 Mar 20 16:22:21 crc kubenswrapper[4675]: I0320 16:22:21.268657 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" event={"ID":"06dfefff-e695-44b8-bb74-10ef9a589184","Type":"ContainerDied","Data":"c72d885b2d21c9cbe9ddbeac4ec5c9e97da730c57c9f2a83cd129fe6a71f4ea0"} Mar 20 16:22:22 crc kubenswrapper[4675]: I0320 16:22:22.638556 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Mar 20 16:22:25 crc kubenswrapper[4675]: E0320 16:22:25.198508 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 20 16:22:25 crc kubenswrapper[4675]: E0320 16:22:25.199146 
4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l6hz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:
[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-7tdnv_openstack(653f25dd-b7f2-4ec1-8569-96af48c4c388): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:22:25 crc kubenswrapper[4675]: E0320 16:22:25.200968 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-7tdnv" podUID="653f25dd-b7f2-4ec1-8569-96af48c4c388" Mar 20 16:22:25 crc kubenswrapper[4675]: E0320 16:22:25.302353 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-7tdnv" podUID="653f25dd-b7f2-4ec1-8569-96af48c4c388" Mar 20 16:22:27 crc kubenswrapper[4675]: I0320 16:22:27.639321 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.663346 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ncgfd" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.769124 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-credential-keys\") pod \"1bec8a0d-764c-4502-a128-6e03564486f7\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.769480 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsl9t\" (UniqueName: \"kubernetes.io/projected/1bec8a0d-764c-4502-a128-6e03564486f7-kube-api-access-gsl9t\") pod \"1bec8a0d-764c-4502-a128-6e03564486f7\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.769587 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-scripts\") pod \"1bec8a0d-764c-4502-a128-6e03564486f7\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.769622 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-fernet-keys\") pod \"1bec8a0d-764c-4502-a128-6e03564486f7\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.769688 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-config-data\") pod \"1bec8a0d-764c-4502-a128-6e03564486f7\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.769811 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-combined-ca-bundle\") pod \"1bec8a0d-764c-4502-a128-6e03564486f7\" (UID: \"1bec8a0d-764c-4502-a128-6e03564486f7\") " Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.776396 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1bec8a0d-764c-4502-a128-6e03564486f7" (UID: "1bec8a0d-764c-4502-a128-6e03564486f7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.779256 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1bec8a0d-764c-4502-a128-6e03564486f7" (UID: "1bec8a0d-764c-4502-a128-6e03564486f7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.780348 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bec8a0d-764c-4502-a128-6e03564486f7-kube-api-access-gsl9t" (OuterVolumeSpecName: "kube-api-access-gsl9t") pod "1bec8a0d-764c-4502-a128-6e03564486f7" (UID: "1bec8a0d-764c-4502-a128-6e03564486f7"). InnerVolumeSpecName "kube-api-access-gsl9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.783881 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-scripts" (OuterVolumeSpecName: "scripts") pod "1bec8a0d-764c-4502-a128-6e03564486f7" (UID: "1bec8a0d-764c-4502-a128-6e03564486f7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.795646 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bec8a0d-764c-4502-a128-6e03564486f7" (UID: "1bec8a0d-764c-4502-a128-6e03564486f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.811581 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-config-data" (OuterVolumeSpecName: "config-data") pod "1bec8a0d-764c-4502-a128-6e03564486f7" (UID: "1bec8a0d-764c-4502-a128-6e03564486f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.873282 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.873316 4675 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.873325 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsl9t\" (UniqueName: \"kubernetes.io/projected/1bec8a0d-764c-4502-a128-6e03564486f7-kube-api-access-gsl9t\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.873357 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.873367 4675 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4675]: I0320 16:22:29.873375 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bec8a0d-764c-4502-a128-6e03564486f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.342307 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ncgfd" event={"ID":"1bec8a0d-764c-4502-a128-6e03564486f7","Type":"ContainerDied","Data":"9a18c5944a225a3c90194afb5077fcdfe1514ac7b87ff4c5112291677f3e8fd1"} Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.342351 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a18c5944a225a3c90194afb5077fcdfe1514ac7b87ff4c5112291677f3e8fd1" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.342380 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ncgfd" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.740068 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ncgfd"] Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.747162 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ncgfd"] Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.846100 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zlp5c"] Mar 20 16:22:30 crc kubenswrapper[4675]: E0320 16:22:30.846852 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bec8a0d-764c-4502-a128-6e03564486f7" containerName="keystone-bootstrap" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.846868 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bec8a0d-764c-4502-a128-6e03564486f7" containerName="keystone-bootstrap" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.847392 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bec8a0d-764c-4502-a128-6e03564486f7" containerName="keystone-bootstrap" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.848462 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.850625 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk87c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.850673 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.850783 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.850902 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.850995 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.858286 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zlp5c"] Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.997020 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-scripts\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.997105 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-config-data\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.997234 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6x287\" (UniqueName: \"kubernetes.io/projected/de350b0e-5712-4f65-b01b-27814457bee4-kube-api-access-6x287\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.997324 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-fernet-keys\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.997367 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-credential-keys\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:30 crc kubenswrapper[4675]: I0320 16:22:30.997394 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-combined-ca-bundle\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.098964 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x287\" (UniqueName: \"kubernetes.io/projected/de350b0e-5712-4f65-b01b-27814457bee4-kube-api-access-6x287\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.099055 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-fernet-keys\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.099093 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-credential-keys\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.099114 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-combined-ca-bundle\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.099135 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-scripts\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.099157 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-config-data\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.104606 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-credential-keys\") pod 
\"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.104741 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-combined-ca-bundle\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.104801 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-config-data\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.105173 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-fernet-keys\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.105951 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-scripts\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc kubenswrapper[4675]: I0320 16:22:31.113312 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x287\" (UniqueName: \"kubernetes.io/projected/de350b0e-5712-4f65-b01b-27814457bee4-kube-api-access-6x287\") pod \"keystone-bootstrap-zlp5c\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:31 crc 
kubenswrapper[4675]: I0320 16:22:31.168860 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zlp5c"
Mar 20 16:22:32 crc kubenswrapper[4675]: I0320 16:22:32.778726 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bec8a0d-764c-4502-a128-6e03564486f7" path="/var/lib/kubelet/pods/1bec8a0d-764c-4502-a128-6e03564486f7/volumes"
Mar 20 16:22:33 crc kubenswrapper[4675]: I0320 16:22:33.778220 4675 generic.go:334] "Generic (PLEG): container finished" podID="7935d7aa-cb6b-4b66-a58f-31e0cce41114" containerID="9e0089b35c709f2f9ddef36339c3d4968101c7fdf03c76ca09b8329ab9d182f7" exitCode=0
Mar 20 16:22:33 crc kubenswrapper[4675]: I0320 16:22:33.778259 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ddfbw" event={"ID":"7935d7aa-cb6b-4b66-a58f-31e0cce41114","Type":"ContainerDied","Data":"9e0089b35c709f2f9ddef36339c3d4968101c7fdf03c76ca09b8329ab9d182f7"}
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.113326 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229033 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-combined-ca-bundle\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229382 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-logs\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229422 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-httpd-run\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229441 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-internal-tls-certs\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229527 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229563 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8nl\" (UniqueName: \"kubernetes.io/projected/a29ce001-6800-43c1-9b4d-24be729f85b8-kube-api-access-jc8nl\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229664 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-scripts\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229693 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-config-data\") pod \"a29ce001-6800-43c1-9b4d-24be729f85b8\" (UID: \"a29ce001-6800-43c1-9b4d-24be729f85b8\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229938 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-logs" (OuterVolumeSpecName: "logs") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.229969 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.230444 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.230472 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a29ce001-6800-43c1-9b4d-24be729f85b8-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.235429 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.236306 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29ce001-6800-43c1-9b4d-24be729f85b8-kube-api-access-jc8nl" (OuterVolumeSpecName: "kube-api-access-jc8nl") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "kube-api-access-jc8nl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.238908 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-scripts" (OuterVolumeSpecName: "scripts") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.265910 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.275537 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.280669 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-config-data" (OuterVolumeSpecName: "config-data") pod "a29ce001-6800-43c1-9b4d-24be729f85b8" (UID: "a29ce001-6800-43c1-9b4d-24be729f85b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.332222 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.332250 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.332264 4675 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.332293 4675 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.332302 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8nl\" (UniqueName: \"kubernetes.io/projected/a29ce001-6800-43c1-9b4d-24be729f85b8-kube-api-access-jc8nl\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.332311 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29ce001-6800-43c1-9b4d-24be729f85b8-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.350152 4675 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.433459 4675 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.619977 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.620044 4675 scope.go:117] "RemoveContainer" containerID="97c04a750576bd75f5d8b5a39dadbe349a69e3f6a807976a4c06126be0395496"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.620143 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5927,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hhfc9_openstack(c1560aa0-d06c-4c98-80bf-0635065cac6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.621224 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hhfc9" podUID="c1560aa0-d06c-4c98-80bf-0635065cac6f"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.634537 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.641132 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.737892 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-config\") pod \"06dfefff-e695-44b8-bb74-10ef9a589184\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738219 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-nb\") pod \"06dfefff-e695-44b8-bb74-10ef9a589184\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738261 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxt4\" (UniqueName: \"kubernetes.io/projected/06dfefff-e695-44b8-bb74-10ef9a589184-kube-api-access-kmxt4\") pod \"06dfefff-e695-44b8-bb74-10ef9a589184\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738334 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-swift-storage-0\") pod \"06dfefff-e695-44b8-bb74-10ef9a589184\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738437 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-sb\") pod \"06dfefff-e695-44b8-bb74-10ef9a589184\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738463 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-svc\") pod \"06dfefff-e695-44b8-bb74-10ef9a589184\" (UID: \"06dfefff-e695-44b8-bb74-10ef9a589184\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738478 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnb27\" (UniqueName: \"kubernetes.io/projected/7935d7aa-cb6b-4b66-a58f-31e0cce41114-kube-api-access-xnb27\") pod \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738497 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-combined-ca-bundle\") pod \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.738521 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-config\") pod \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\" (UID: \"7935d7aa-cb6b-4b66-a58f-31e0cce41114\") "
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.743356 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7935d7aa-cb6b-4b66-a58f-31e0cce41114-kube-api-access-xnb27" (OuterVolumeSpecName: "kube-api-access-xnb27") pod "7935d7aa-cb6b-4b66-a58f-31e0cce41114" (UID: "7935d7aa-cb6b-4b66-a58f-31e0cce41114"). InnerVolumeSpecName "kube-api-access-xnb27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.744173 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06dfefff-e695-44b8-bb74-10ef9a589184-kube-api-access-kmxt4" (OuterVolumeSpecName: "kube-api-access-kmxt4") pod "06dfefff-e695-44b8-bb74-10ef9a589184" (UID: "06dfefff-e695-44b8-bb74-10ef9a589184"). InnerVolumeSpecName "kube-api-access-kmxt4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.750188 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.750273 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.763396 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-config" (OuterVolumeSpecName: "config") pod "7935d7aa-cb6b-4b66-a58f-31e0cce41114" (UID: "7935d7aa-cb6b-4b66-a58f-31e0cce41114"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.771113 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7935d7aa-cb6b-4b66-a58f-31e0cce41114" (UID: "7935d7aa-cb6b-4b66-a58f-31e0cce41114"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.778792 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06dfefff-e695-44b8-bb74-10ef9a589184" (UID: "06dfefff-e695-44b8-bb74-10ef9a589184"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.781299 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06dfefff-e695-44b8-bb74-10ef9a589184" (UID: "06dfefff-e695-44b8-bb74-10ef9a589184"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.784600 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06dfefff-e695-44b8-bb74-10ef9a589184" (UID: "06dfefff-e695-44b8-bb74-10ef9a589184"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.785551 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-config" (OuterVolumeSpecName: "config") pod "06dfefff-e695-44b8-bb74-10ef9a589184" (UID: "06dfefff-e695-44b8-bb74-10ef9a589184"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.787460 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "06dfefff-e695-44b8-bb74-10ef9a589184" (UID: "06dfefff-e695-44b8-bb74-10ef9a589184"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.809128 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a29ce001-6800-43c1-9b4d-24be729f85b8","Type":"ContainerDied","Data":"8b7daba08dee46e7f842fc0519b606984dc9270f1be9383a1497f3e5b863290a"}
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.809173 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.813398 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn" event={"ID":"06dfefff-e695-44b8-bb74-10ef9a589184","Type":"ContainerDied","Data":"682331a98e7cf87c23e5479cbd43866486e48afa51d2793cd56dd1badbcb2152"}
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.813598 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-7shgn"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.816333 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ddfbw"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.816412 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ddfbw" event={"ID":"7935d7aa-cb6b-4b66-a58f-31e0cce41114","Type":"ContainerDied","Data":"55e540a16359bd80376f74367b953322dbcc7c78934a5bdeed6564b4afb4ef9a"}
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.816946 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55e540a16359bd80376f74367b953322dbcc7c78934a5bdeed6564b4afb4ef9a"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.817492 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hhfc9" podUID="c1560aa0-d06c-4c98-80bf-0635065cac6f"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840316 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840342 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840353 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxt4\" (UniqueName: \"kubernetes.io/projected/06dfefff-e695-44b8-bb74-10ef9a589184-kube-api-access-kmxt4\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840363 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840371 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840379 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06dfefff-e695-44b8-bb74-10ef9a589184-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840387 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnb27\" (UniqueName: \"kubernetes.io/projected/7935d7aa-cb6b-4b66-a58f-31e0cce41114-kube-api-access-xnb27\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840395 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.840403 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7935d7aa-cb6b-4b66-a58f-31e0cce41114-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.872480 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.888987 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.902992 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7shgn"]
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.910186 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-7shgn"]
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.917140 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.917586 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="init"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.917608 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="init"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.917632 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-httpd"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.917642 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-httpd"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.917654 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.917663 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.917678 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7935d7aa-cb6b-4b66-a58f-31e0cce41114" containerName="neutron-db-sync"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.917687 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7935d7aa-cb6b-4b66-a58f-31e0cce41114" containerName="neutron-db-sync"
Mar 20 16:22:37 crc kubenswrapper[4675]: E0320 16:22:37.917707 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-log"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.917714 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-log"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.918210 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7935d7aa-cb6b-4b66-a58f-31e0cce41114" containerName="neutron-db-sync"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.918232 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-log"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.918245 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" containerName="dnsmasq-dns"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.918256 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" containerName="glance-httpd"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.919238 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.921751 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.922054 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 16:22:37 crc kubenswrapper[4675]: I0320 16:22:37.924887 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.045796 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.045883 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.045916 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.045943 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.045996 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db7g4\" (UniqueName: \"kubernetes.io/projected/bb4f4c66-7029-49ad-aa71-38faa62d3178-kube-api-access-db7g4\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.046091 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.046178 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.046248 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147361 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db7g4\" (UniqueName: \"kubernetes.io/projected/bb4f4c66-7029-49ad-aa71-38faa62d3178-kube-api-access-db7g4\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147480 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147542 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147600 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147624 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147676 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147694 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.147712 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.148126 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.148306 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.148405 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.151056 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.152157 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.152309 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.155344 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.165549 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db7g4\" (UniqueName: \"kubernetes.io/projected/bb4f4c66-7029-49ad-aa71-38faa62d3178-kube-api-access-db7g4\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.176578 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.238302 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.682327 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06dfefff-e695-44b8-bb74-10ef9a589184" path="/var/lib/kubelet/pods/06dfefff-e695-44b8-bb74-10ef9a589184/volumes"
Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.682989 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29ce001-6800-43c1-9b4d-24be729f85b8" path="/var/lib/kubelet/pods/a29ce001-6800-43c1-9b4d-24be729f85b8/volumes"
Mar 20 16:22:38 crc kubenswrapper[4675]: E0320 16:22:38.920280 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 20 16:22:38 crc kubenswrapper[4675]: E0320 16:22:38.920761 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zkck2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-tt28r_openstack(37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:22:38 crc kubenswrapper[4675]: E0320 16:22:38.922218 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-tt28r" podUID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.933112 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-xxh7q"] Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.937024 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:38 crc kubenswrapper[4675]: I0320 16:22:38.956160 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-xxh7q"] Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.070685 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-865b456f44-shc9z"] Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.071809 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.071877 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-config\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.071900 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8ztp\" (UniqueName: \"kubernetes.io/projected/6038548b-5443-43f8-adec-c61ed20a6c2f-kube-api-access-t8ztp\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.071956 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.072008 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.072041 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.072100 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.074471 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.076577 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.076727 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.076909 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pch7c" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.093714 4675 scope.go:117] "RemoveContainer" containerID="9f37a56a3bcc72e943f46014e0b4fdf1e086b73ac973167b466517aac66a3907" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.098038 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-865b456f44-shc9z"] Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173104 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-ovndb-tls-certs\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173368 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173423 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173463 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-httpd-config\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173488 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-config\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 
16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173514 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173553 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-config\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173582 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8ztp\" (UniqueName: \"kubernetes.io/projected/6038548b-5443-43f8-adec-c61ed20a6c2f-kube-api-access-t8ztp\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173621 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxf4\" (UniqueName: \"kubernetes.io/projected/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-kube-api-access-vrxf4\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.173679 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc 
kubenswrapper[4675]: I0320 16:22:39.173707 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-combined-ca-bundle\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.174781 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.175440 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.176057 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-config\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.177133 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.178209 4675 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.210052 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8ztp\" (UniqueName: \"kubernetes.io/projected/6038548b-5443-43f8-adec-c61ed20a6c2f-kube-api-access-t8ztp\") pod \"dnsmasq-dns-84b966f6c9-xxh7q\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.275891 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-httpd-config\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.275941 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-config\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.276005 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxf4\" (UniqueName: \"kubernetes.io/projected/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-kube-api-access-vrxf4\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.276064 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-combined-ca-bundle\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.276106 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-ovndb-tls-certs\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.280421 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-combined-ca-bundle\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.280958 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-ovndb-tls-certs\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.294652 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-httpd-config\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.300501 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxf4\" (UniqueName: \"kubernetes.io/projected/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-kube-api-access-vrxf4\") pod \"neutron-865b456f44-shc9z\" 
(UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.322046 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-config\") pod \"neutron-865b456f44-shc9z\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.344201 4675 scope.go:117] "RemoveContainer" containerID="fb69cf076750f7c9bd61dd2d02d329ffd48ccc53bc41a06aeacdecbc0849be38" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.384580 4675 scope.go:117] "RemoveContainer" containerID="c72d885b2d21c9cbe9ddbeac4ec5c9e97da730c57c9f2a83cd129fe6a71f4ea0" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.507122 4675 scope.go:117] "RemoveContainer" containerID="656791b8ce18a50cf9f243ec1a2e07c49d6ad11a51352693abfedd11a08e5532" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.535293 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.555178 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.652129 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d79d4db6d-vnw9g"] Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.662087 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c454cc68b-lmjfb"] Mar 20 16:22:39 crc kubenswrapper[4675]: W0320 16:22:39.668092 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e972387_c641_42bd_9c3f_69fc70869c8a.slice/crio-5d3bbd99939ccc36e5f13f7951334d98a990c57183616b6d2b3dc2a1f206ca3f WatchSource:0}: Error finding container 5d3bbd99939ccc36e5f13f7951334d98a990c57183616b6d2b3dc2a1f206ca3f: Status 404 returned error can't find the container with id 5d3bbd99939ccc36e5f13f7951334d98a990c57183616b6d2b3dc2a1f206ca3f Mar 20 16:22:39 crc kubenswrapper[4675]: W0320 16:22:39.672270 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb81ab73e_24ce_451b_9064_d6ebea2c5976.slice/crio-a021f30cb4ea3194e7ef7049678e88b1c3537807278e9a6ab708a124e5c74b14 WatchSource:0}: Error finding container a021f30cb4ea3194e7ef7049678e88b1c3537807278e9a6ab708a124e5c74b14: Status 404 returned error can't find the container with id a021f30cb4ea3194e7ef7049678e88b1c3537807278e9a6ab708a124e5c74b14 Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.859953 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zlp5c"] Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.864440 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-546bd65fb7-ktfbp" event={"ID":"b282cfa4-8448-4eef-8463-fca67d9608fd","Type":"ContainerStarted","Data":"0675bef8fe5eaffaa6c1e0c910e6915ea7c74dafe2ab8373e5ebc1df747f5a63"} Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 
16:22:39.865933 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c454cc68b-lmjfb" event={"ID":"3e972387-c641-42bd-9c3f-69fc70869c8a","Type":"ContainerStarted","Data":"5d3bbd99939ccc36e5f13f7951334d98a990c57183616b6d2b3dc2a1f206ca3f"} Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.869212 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d79d4db6d-vnw9g" event={"ID":"b81ab73e-24ce-451b-9064-d6ebea2c5976","Type":"ContainerStarted","Data":"a021f30cb4ea3194e7ef7049678e88b1c3537807278e9a6ab708a124e5c74b14"} Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.870557 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9b9d8b55-gbspf" event={"ID":"85c227a3-b831-440b-ab1a-4171217faf81","Type":"ContainerStarted","Data":"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87"} Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.881972 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf9f9b95c-kgcr5" event={"ID":"d94a2d76-92e2-4403-ad6d-e2124b400d78","Type":"ContainerStarted","Data":"91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817"} Mar 20 16:22:39 crc kubenswrapper[4675]: E0320 16:22:39.884023 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-tt28r" podUID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" Mar 20 16:22:39 crc kubenswrapper[4675]: I0320 16:22:39.950394 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.020138 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:40 crc kubenswrapper[4675]: W0320 16:22:40.038269 4675 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e0a711_d2ca_451c_8327_1a045ac918e4.slice/crio-f5b61972d83776956c92f29518aa79e21390386a2df454a98f0c610d6a8bf9ed WatchSource:0}: Error finding container f5b61972d83776956c92f29518aa79e21390386a2df454a98f0c610d6a8bf9ed: Status 404 returned error can't find the container with id f5b61972d83776956c92f29518aa79e21390386a2df454a98f0c610d6a8bf9ed Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.188854 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-xxh7q"] Mar 20 16:22:40 crc kubenswrapper[4675]: W0320 16:22:40.193959 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6038548b_5443_43f8_adec_c61ed20a6c2f.slice/crio-3058832c4d856ee34dc81f9cb4bc74e6bdfc4bb4eeb3935bb83216bd6e595525 WatchSource:0}: Error finding container 3058832c4d856ee34dc81f9cb4bc74e6bdfc4bb4eeb3935bb83216bd6e595525: Status 404 returned error can't find the container with id 3058832c4d856ee34dc81f9cb4bc74e6bdfc4bb4eeb3935bb83216bd6e595525 Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.302669 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-865b456f44-shc9z"] Mar 20 16:22:40 crc kubenswrapper[4675]: W0320 16:22:40.348825 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05eec34e_bcd5_4ba6_ad16_bacbf5cb4b58.slice/crio-ec91cda1e5994f49c175a09687a704fd8ac89307d9b6d622f52124f69dbf9964 WatchSource:0}: Error finding container ec91cda1e5994f49c175a09687a704fd8ac89307d9b6d622f52124f69dbf9964: Status 404 returned error can't find the container with id ec91cda1e5994f49c175a09687a704fd8ac89307d9b6d622f52124f69dbf9964 Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.920235 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-5bf9f9b95c-kgcr5" event={"ID":"d94a2d76-92e2-4403-ad6d-e2124b400d78","Type":"ContainerStarted","Data":"f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.920582 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5bf9f9b95c-kgcr5" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon-log" containerID="cri-o://91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817" gracePeriod=30 Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.920918 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5bf9f9b95c-kgcr5" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon" containerID="cri-o://f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7" gracePeriod=30 Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.933043 4675 generic.go:334] "Generic (PLEG): container finished" podID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerID="dec817acbd187c5374d806e20a1e31335626743dd9e3d6d16cdbab7a46c8fea5" exitCode=0 Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.933157 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" event={"ID":"6038548b-5443-43f8-adec-c61ed20a6c2f","Type":"ContainerDied","Data":"dec817acbd187c5374d806e20a1e31335626743dd9e3d6d16cdbab7a46c8fea5"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.933188 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" event={"ID":"6038548b-5443-43f8-adec-c61ed20a6c2f","Type":"ContainerStarted","Data":"3058832c4d856ee34dc81f9cb4bc74e6bdfc4bb4eeb3935bb83216bd6e595525"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.959207 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d79d4db6d-vnw9g" 
event={"ID":"b81ab73e-24ce-451b-9064-d6ebea2c5976","Type":"ContainerStarted","Data":"e4e169e96e53f774cf8b082e1f0c8434fcd58abb9366bcfc0a611d6b71610c95"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.966721 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7tdnv" event={"ID":"653f25dd-b7f2-4ec1-8569-96af48c4c388","Type":"ContainerStarted","Data":"1fd993bae4789312a411c055de30db24256d857c08bf7976bb053c0533e6ac11"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.972245 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5bf9f9b95c-kgcr5" podStartSLOduration=4.157375266 podStartE2EDuration="32.972222978s" podCreationTimestamp="2026-03-20 16:22:08 +0000 UTC" firstStartedPulling="2026-03-20 16:22:10.287689963 +0000 UTC m=+1250.321319500" lastFinishedPulling="2026-03-20 16:22:39.102537665 +0000 UTC m=+1279.136167212" observedRunningTime="2026-03-20 16:22:40.937153762 +0000 UTC m=+1280.970783299" watchObservedRunningTime="2026-03-20 16:22:40.972222978 +0000 UTC m=+1281.005852515" Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.981513 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zlp5c" event={"ID":"de350b0e-5712-4f65-b01b-27814457bee4","Type":"ContainerStarted","Data":"6ad697ba14cc1d555c40e10733a5e068a20269a2df62c60822265fc27336cae1"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.981583 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zlp5c" event={"ID":"de350b0e-5712-4f65-b01b-27814457bee4","Type":"ContainerStarted","Data":"ed2caaf22d7e3f171310e03aafcea00949fc8d36eab19fcb2c3c8d0fee5645d6"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.984608 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bb4f4c66-7029-49ad-aa71-38faa62d3178","Type":"ContainerStarted","Data":"ab5b58b6bf2e1a2e52e3f98777d2d6b997308d88bcc779a5ba96a6dda56d377e"} Mar 20 16:22:40 crc kubenswrapper[4675]: I0320 16:22:40.984662 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4f4c66-7029-49ad-aa71-38faa62d3178","Type":"ContainerStarted","Data":"e6c224423a1ca91154a2184b308505a5d0694fe730fc9c6a3fa51e68bbbdd178"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.001074 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-546bd65fb7-ktfbp" event={"ID":"b282cfa4-8448-4eef-8463-fca67d9608fd","Type":"ContainerStarted","Data":"0ea697f15b863561ce440e2aacebf2bd03ae127718dffe4eab7a26be5517aab0"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.001252 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-546bd65fb7-ktfbp" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon" containerID="cri-o://0ea697f15b863561ce440e2aacebf2bd03ae127718dffe4eab7a26be5517aab0" gracePeriod=30 Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.001203 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-546bd65fb7-ktfbp" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon-log" containerID="cri-o://0675bef8fe5eaffaa6c1e0c910e6915ea7c74dafe2ab8373e5ebc1df747f5a63" gracePeriod=30 Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.012102 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c454cc68b-lmjfb" event={"ID":"3e972387-c641-42bd-9c3f-69fc70869c8a","Type":"ContainerStarted","Data":"6186dbac985c54733a1640322c538c03ab0fe9a306fd411d2f745642ee54de5f"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.028065 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7tdnv" 
podStartSLOduration=3.793873177 podStartE2EDuration="32.028044271s" podCreationTimestamp="2026-03-20 16:22:09 +0000 UTC" firstStartedPulling="2026-03-20 16:22:10.866668233 +0000 UTC m=+1250.900297770" lastFinishedPulling="2026-03-20 16:22:39.100839327 +0000 UTC m=+1279.134468864" observedRunningTime="2026-03-20 16:22:40.9884717 +0000 UTC m=+1281.022101237" watchObservedRunningTime="2026-03-20 16:22:41.028044271 +0000 UTC m=+1281.061673808" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.039305 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zlp5c" podStartSLOduration=11.039285054 podStartE2EDuration="11.039285054s" podCreationTimestamp="2026-03-20 16:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:41.008027084 +0000 UTC m=+1281.041656611" watchObservedRunningTime="2026-03-20 16:22:41.039285054 +0000 UTC m=+1281.072914591" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.042051 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerStarted","Data":"cfae4fcd530731710080b8d955c3d7b3698573155592d864f9bc377d876d1c7f"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.058174 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-546bd65fb7-ktfbp" podStartSLOduration=3.998258274 podStartE2EDuration="32.058154229s" podCreationTimestamp="2026-03-20 16:22:09 +0000 UTC" firstStartedPulling="2026-03-20 16:22:10.872009702 +0000 UTC m=+1250.905639239" lastFinishedPulling="2026-03-20 16:22:38.931905657 +0000 UTC m=+1278.965535194" observedRunningTime="2026-03-20 16:22:41.035524699 +0000 UTC m=+1281.069154236" watchObservedRunningTime="2026-03-20 16:22:41.058154229 +0000 UTC m=+1281.091783766" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.068058 
4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9b9d8b55-gbspf" event={"ID":"85c227a3-b831-440b-ab1a-4171217faf81","Type":"ContainerStarted","Data":"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.068315 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b9b9d8b55-gbspf" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon-log" containerID="cri-o://e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87" gracePeriod=30 Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.068407 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b9b9d8b55-gbspf" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon" containerID="cri-o://a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d" gracePeriod=30 Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.078400 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-865b456f44-shc9z" event={"ID":"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58","Type":"ContainerStarted","Data":"fae7ab551b9c0037632bcaffa816c711fb33b545e85c0e139da1661897c23f42"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.078453 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-865b456f44-shc9z" event={"ID":"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58","Type":"ContainerStarted","Data":"ec91cda1e5994f49c175a09687a704fd8ac89307d9b6d622f52124f69dbf9964"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.084367 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e0a711-d2ca-451c-8327-1a045ac918e4","Type":"ContainerStarted","Data":"f5b61972d83776956c92f29518aa79e21390386a2df454a98f0c610d6a8bf9ed"} Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.089207 4675 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-7b9b9d8b55-gbspf" podStartSLOduration=4.395538915 podStartE2EDuration="30.089192802s" podCreationTimestamp="2026-03-20 16:22:11 +0000 UTC" firstStartedPulling="2026-03-20 16:22:13.328695947 +0000 UTC m=+1253.362325484" lastFinishedPulling="2026-03-20 16:22:39.022349834 +0000 UTC m=+1279.055979371" observedRunningTime="2026-03-20 16:22:41.088121053 +0000 UTC m=+1281.121750590" watchObservedRunningTime="2026-03-20 16:22:41.089192802 +0000 UTC m=+1281.122822339" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.602242 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b6db44d49-fvfv9"] Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.621577 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.624250 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.624500 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.646293 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b6db44d49-fvfv9"] Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.764789 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-combined-ca-bundle\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.764951 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-httpd-config\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.765008 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-config\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.765053 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-internal-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.765090 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-public-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.765125 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkc5k\" (UniqueName: \"kubernetes.io/projected/e7c2420c-dca0-41c9-8f28-79a5337c7444-kube-api-access-xkc5k\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.765157 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-ovndb-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867524 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-httpd-config\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867558 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-config\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867602 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-internal-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867628 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-public-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867657 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkc5k\" (UniqueName: \"kubernetes.io/projected/e7c2420c-dca0-41c9-8f28-79a5337c7444-kube-api-access-xkc5k\") pod \"neutron-b6db44d49-fvfv9\" 
(UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867691 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-ovndb-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.867710 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-combined-ca-bundle\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.879611 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-public-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.886705 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-internal-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.889459 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-httpd-config\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc 
kubenswrapper[4675]: I0320 16:22:41.890257 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-combined-ca-bundle\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.892385 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-ovndb-tls-certs\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.893727 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-config\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.907939 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkc5k\" (UniqueName: \"kubernetes.io/projected/e7c2420c-dca0-41c9-8f28-79a5337c7444-kube-api-access-xkc5k\") pod \"neutron-b6db44d49-fvfv9\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:41 crc kubenswrapper[4675]: I0320 16:22:41.949359 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.119560 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d79d4db6d-vnw9g" event={"ID":"b81ab73e-24ce-451b-9064-d6ebea2c5976","Type":"ContainerStarted","Data":"c3b61933084eac34146f74c14d4cd643b9ee350d9b791cfb2ea1130b26ac19fe"} Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.126830 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-865b456f44-shc9z" event={"ID":"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58","Type":"ContainerStarted","Data":"e41dc63a82c84b7e7442f52147268d76150695d914c4d93e30a5de24cdca7695"} Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.127606 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.129763 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e0a711-d2ca-451c-8327-1a045ac918e4","Type":"ContainerStarted","Data":"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0"} Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.158384 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" event={"ID":"6038548b-5443-43f8-adec-c61ed20a6c2f","Type":"ContainerStarted","Data":"b98ea38e522301d1725bf630c264a8f413e300bd1d35097ce7fa4916c2baddc0"} Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.158802 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.201060 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c454cc68b-lmjfb" event={"ID":"3e972387-c641-42bd-9c3f-69fc70869c8a","Type":"ContainerStarted","Data":"94241f316e66ac5988644a931eae1198f8d67b4991a0efd911a7d74bc29be0f7"} Mar 20 16:22:42 crc 
kubenswrapper[4675]: I0320 16:22:42.224265 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d79d4db6d-vnw9g" podStartSLOduration=24.224243294 podStartE2EDuration="24.224243294s" podCreationTimestamp="2026-03-20 16:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:42.158242907 +0000 UTC m=+1282.191872444" watchObservedRunningTime="2026-03-20 16:22:42.224243294 +0000 UTC m=+1282.257872831" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.225584 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-865b456f44-shc9z" podStartSLOduration=3.225577731 podStartE2EDuration="3.225577731s" podCreationTimestamp="2026-03-20 16:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:42.223927565 +0000 UTC m=+1282.257557102" watchObservedRunningTime="2026-03-20 16:22:42.225577731 +0000 UTC m=+1282.259207268" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.257748 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c454cc68b-lmjfb" podStartSLOduration=24.257732395 podStartE2EDuration="24.257732395s" podCreationTimestamp="2026-03-20 16:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:42.256016788 +0000 UTC m=+1282.289646325" watchObservedRunningTime="2026-03-20 16:22:42.257732395 +0000 UTC m=+1282.291361932" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.291479 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" podStartSLOduration=4.291460414 podStartE2EDuration="4.291460414s" podCreationTimestamp="2026-03-20 16:22:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:42.286248859 +0000 UTC m=+1282.319878396" watchObservedRunningTime="2026-03-20 16:22:42.291460414 +0000 UTC m=+1282.325089951" Mar 20 16:22:42 crc kubenswrapper[4675]: I0320 16:22:42.574264 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b9b9d8b55-gbspf" Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.215765 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e0a711-d2ca-451c-8327-1a045ac918e4","Type":"ContainerStarted","Data":"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e"} Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.216075 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-log" containerID="cri-o://c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0" gracePeriod=30 Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.216151 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-httpd" containerID="cri-o://36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e" gracePeriod=30 Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.224512 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4f4c66-7029-49ad-aa71-38faa62d3178","Type":"ContainerStarted","Data":"d866f3970e9e9bc778662ccdfa3abf3d53ca0536f21f70f0a0b5c984d37bae6f"} Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.227307 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b6db44d49-fvfv9"] Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 
16:22:44.229236 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerStarted","Data":"2ba333a09bb26c779a9007ef8f5f2b0f1a347e974a46a84d5905bff9abf957d7"} Mar 20 16:22:44 crc kubenswrapper[4675]: W0320 16:22:44.232712 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c2420c_dca0_41c9_8f28_79a5337c7444.slice/crio-f8e1a0252f802d37a70ca8af7f07735d8d82bb1f171df95f6806b394b8e1c101 WatchSource:0}: Error finding container f8e1a0252f802d37a70ca8af7f07735d8d82bb1f171df95f6806b394b8e1c101: Status 404 returned error can't find the container with id f8e1a0252f802d37a70ca8af7f07735d8d82bb1f171df95f6806b394b8e1c101 Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.261213 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=27.261195331 podStartE2EDuration="27.261195331s" podCreationTimestamp="2026-03-20 16:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:44.246477291 +0000 UTC m=+1284.280106828" watchObservedRunningTime="2026-03-20 16:22:44.261195331 +0000 UTC m=+1284.294824868" Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.287959 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.287912664 podStartE2EDuration="7.287912664s" podCreationTimestamp="2026-03-20 16:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:44.272125775 +0000 UTC m=+1284.305755302" watchObservedRunningTime="2026-03-20 16:22:44.287912664 +0000 UTC m=+1284.321542201" Mar 20 16:22:44 crc kubenswrapper[4675]: I0320 16:22:44.965539 
4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042475 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042578 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-config-data\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042641 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-scripts\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042768 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-public-tls-certs\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042866 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqq5\" (UniqueName: \"kubernetes.io/projected/e4e0a711-d2ca-451c-8327-1a045ac918e4-kube-api-access-6mqq5\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042917 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-httpd-run\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.042975 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-logs\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.043003 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-combined-ca-bundle\") pod \"e4e0a711-d2ca-451c-8327-1a045ac918e4\" (UID: \"e4e0a711-d2ca-451c-8327-1a045ac918e4\") " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.046842 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.046863 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-logs" (OuterVolumeSpecName: "logs") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.055039 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.057251 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e0a711-d2ca-451c-8327-1a045ac918e4-kube-api-access-6mqq5" (OuterVolumeSpecName: "kube-api-access-6mqq5") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "kube-api-access-6mqq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.076985 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-scripts" (OuterVolumeSpecName: "scripts") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.134375 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.142760 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.154456 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-config-data" (OuterVolumeSpecName: "config-data") pod "e4e0a711-d2ca-451c-8327-1a045ac918e4" (UID: "e4e0a711-d2ca-451c-8327-1a045ac918e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156351 4675 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156388 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqq5\" (UniqueName: \"kubernetes.io/projected/e4e0a711-d2ca-451c-8327-1a045ac918e4-kube-api-access-6mqq5\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156399 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156407 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e0a711-d2ca-451c-8327-1a045ac918e4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 
16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156418 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156441 4675 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156451 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.156460 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e0a711-d2ca-451c-8327-1a045ac918e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.186739 4675 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.240026 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6db44d49-fvfv9" event={"ID":"e7c2420c-dca0-41c9-8f28-79a5337c7444","Type":"ContainerStarted","Data":"43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.240094 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6db44d49-fvfv9" event={"ID":"e7c2420c-dca0-41c9-8f28-79a5337c7444","Type":"ContainerStarted","Data":"e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.240105 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-b6db44d49-fvfv9" event={"ID":"e7c2420c-dca0-41c9-8f28-79a5337c7444","Type":"ContainerStarted","Data":"f8e1a0252f802d37a70ca8af7f07735d8d82bb1f171df95f6806b394b8e1c101"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.241102 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.253507 4675 generic.go:334] "Generic (PLEG): container finished" podID="653f25dd-b7f2-4ec1-8569-96af48c4c388" containerID="1fd993bae4789312a411c055de30db24256d857c08bf7976bb053c0533e6ac11" exitCode=0 Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.253591 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7tdnv" event={"ID":"653f25dd-b7f2-4ec1-8569-96af48c4c388","Type":"ContainerDied","Data":"1fd993bae4789312a411c055de30db24256d857c08bf7976bb053c0533e6ac11"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257470 4675 generic.go:334] "Generic (PLEG): container finished" podID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerID="36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e" exitCode=0 Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257491 4675 generic.go:334] "Generic (PLEG): container finished" podID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerID="c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0" exitCode=143 Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257550 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e0a711-d2ca-451c-8327-1a045ac918e4","Type":"ContainerDied","Data":"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257570 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e4e0a711-d2ca-451c-8327-1a045ac918e4","Type":"ContainerDied","Data":"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257578 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e0a711-d2ca-451c-8327-1a045ac918e4","Type":"ContainerDied","Data":"f5b61972d83776956c92f29518aa79e21390386a2df454a98f0c610d6a8bf9ed"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257623 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.257596 4675 scope.go:117] "RemoveContainer" containerID="36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.259150 4675 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.270640 4675 generic.go:334] "Generic (PLEG): container finished" podID="de350b0e-5712-4f65-b01b-27814457bee4" containerID="6ad697ba14cc1d555c40e10733a5e068a20269a2df62c60822265fc27336cae1" exitCode=0 Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.271359 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zlp5c" event={"ID":"de350b0e-5712-4f65-b01b-27814457bee4","Type":"ContainerDied","Data":"6ad697ba14cc1d555c40e10733a5e068a20269a2df62c60822265fc27336cae1"} Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.278840 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b6db44d49-fvfv9" podStartSLOduration=4.278818704 podStartE2EDuration="4.278818704s" podCreationTimestamp="2026-03-20 16:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:45.264738803 +0000 UTC m=+1285.298368360" watchObservedRunningTime="2026-03-20 16:22:45.278818704 +0000 UTC m=+1285.312448241" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.335542 4675 scope.go:117] "RemoveContainer" containerID="c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.371756 4675 scope.go:117] "RemoveContainer" containerID="36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e" Mar 20 16:22:45 crc kubenswrapper[4675]: E0320 16:22:45.374970 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e\": container with ID starting with 36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e not found: ID does not exist" containerID="36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375006 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e"} err="failed to get container status \"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e\": rpc error: code = NotFound desc = could not find container \"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e\": container with ID starting with 36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e not found: ID does not exist" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375035 4675 scope.go:117] "RemoveContainer" containerID="c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0" Mar 20 16:22:45 crc kubenswrapper[4675]: E0320 16:22:45.375305 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0\": container with ID starting with c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0 not found: ID does not exist" containerID="c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375338 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0"} err="failed to get container status \"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0\": rpc error: code = NotFound desc = could not find container \"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0\": container with ID starting with c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0 not found: ID does not exist" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375362 4675 scope.go:117] "RemoveContainer" containerID="36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375564 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e"} err="failed to get container status \"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e\": rpc error: code = NotFound desc = could not find container \"36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e\": container with ID starting with 36f57cdd1c896ec833a9939c0f43fccb1d5ce6ae2a4f5a9bdb1d201f31d51c6e not found: ID does not exist" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375576 4675 scope.go:117] "RemoveContainer" containerID="c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.375737 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0"} err="failed to get container status \"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0\": rpc error: code = NotFound desc = could not find container \"c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0\": container with ID starting with c13b142c068103d3e36cf726c76c33ffb3307e72a1e30cc7bc0bded5dd8125f0 not found: ID does not exist" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.379911 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.393002 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.431840 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:45 crc kubenswrapper[4675]: E0320 16:22:45.432179 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-httpd" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.432194 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-httpd" Mar 20 16:22:45 crc kubenswrapper[4675]: E0320 16:22:45.432206 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-log" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.432213 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-log" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.432398 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-httpd" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.432429 
4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" containerName="glance-log" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.433270 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.436283 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.440160 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.440449 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571059 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571242 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnhz2\" (UniqueName: \"kubernetes.io/projected/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-kube-api-access-vnhz2\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571290 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571325 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571365 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571394 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571510 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-logs\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.571544 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: 
\"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672594 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnhz2\" (UniqueName: \"kubernetes.io/projected/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-kube-api-access-vnhz2\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672703 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672728 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672751 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " 
pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672770 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672838 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.672853 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-logs\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.673181 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.673694 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 
16:22:45.675180 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-logs\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.677851 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.680282 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.680817 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.701464 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.707907 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnhz2\" 
(UniqueName: \"kubernetes.io/projected/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-kube-api-access-vnhz2\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.750296 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:45 crc kubenswrapper[4675]: I0320 16:22:45.760421 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.488620 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:46 crc kubenswrapper[4675]: W0320 16:22:46.547654 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9de8c7f_b5e1_4222_ac09_24fa8e26b089.slice/crio-546bfc4aa6bc3c969b064fd16c9388c7fffb08c84df37c65b4ba469ea614034b WatchSource:0}: Error finding container 546bfc4aa6bc3c969b064fd16c9388c7fffb08c84df37c65b4ba469ea614034b: Status 404 returned error can't find the container with id 546bfc4aa6bc3c969b064fd16c9388c7fffb08c84df37c65b4ba469ea614034b Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.693080 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e0a711-d2ca-451c-8327-1a045ac918e4" path="/var/lib/kubelet/pods/e4e0a711-d2ca-451c-8327-1a045ac918e4/volumes" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.843538 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.857321 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898175 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6hz6\" (UniqueName: \"kubernetes.io/projected/653f25dd-b7f2-4ec1-8569-96af48c4c388-kube-api-access-l6hz6\") pod \"653f25dd-b7f2-4ec1-8569-96af48c4c388\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898229 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-combined-ca-bundle\") pod \"653f25dd-b7f2-4ec1-8569-96af48c4c388\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898354 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-config-data\") pod \"653f25dd-b7f2-4ec1-8569-96af48c4c388\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898380 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-combined-ca-bundle\") pod \"de350b0e-5712-4f65-b01b-27814457bee4\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898422 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-config-data\") pod \"de350b0e-5712-4f65-b01b-27814457bee4\" (UID: 
\"de350b0e-5712-4f65-b01b-27814457bee4\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898470 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-fernet-keys\") pod \"de350b0e-5712-4f65-b01b-27814457bee4\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898519 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x287\" (UniqueName: \"kubernetes.io/projected/de350b0e-5712-4f65-b01b-27814457bee4-kube-api-access-6x287\") pod \"de350b0e-5712-4f65-b01b-27814457bee4\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898558 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-credential-keys\") pod \"de350b0e-5712-4f65-b01b-27814457bee4\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898651 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653f25dd-b7f2-4ec1-8569-96af48c4c388-logs\") pod \"653f25dd-b7f2-4ec1-8569-96af48c4c388\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898674 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-scripts\") pod \"653f25dd-b7f2-4ec1-8569-96af48c4c388\" (UID: \"653f25dd-b7f2-4ec1-8569-96af48c4c388\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.898695 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-scripts\") pod \"de350b0e-5712-4f65-b01b-27814457bee4\" (UID: \"de350b0e-5712-4f65-b01b-27814457bee4\") " Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.903797 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653f25dd-b7f2-4ec1-8569-96af48c4c388-kube-api-access-l6hz6" (OuterVolumeSpecName: "kube-api-access-l6hz6") pod "653f25dd-b7f2-4ec1-8569-96af48c4c388" (UID: "653f25dd-b7f2-4ec1-8569-96af48c4c388"). InnerVolumeSpecName "kube-api-access-l6hz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.904270 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "de350b0e-5712-4f65-b01b-27814457bee4" (UID: "de350b0e-5712-4f65-b01b-27814457bee4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.917162 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/653f25dd-b7f2-4ec1-8569-96af48c4c388-logs" (OuterVolumeSpecName: "logs") pod "653f25dd-b7f2-4ec1-8569-96af48c4c388" (UID: "653f25dd-b7f2-4ec1-8569-96af48c4c388"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.937017 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "de350b0e-5712-4f65-b01b-27814457bee4" (UID: "de350b0e-5712-4f65-b01b-27814457bee4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.952348 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-scripts" (OuterVolumeSpecName: "scripts") pod "de350b0e-5712-4f65-b01b-27814457bee4" (UID: "de350b0e-5712-4f65-b01b-27814457bee4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.953320 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-scripts" (OuterVolumeSpecName: "scripts") pod "653f25dd-b7f2-4ec1-8569-96af48c4c388" (UID: "653f25dd-b7f2-4ec1-8569-96af48c4c388"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.956887 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de350b0e-5712-4f65-b01b-27814457bee4-kube-api-access-6x287" (OuterVolumeSpecName: "kube-api-access-6x287") pod "de350b0e-5712-4f65-b01b-27814457bee4" (UID: "de350b0e-5712-4f65-b01b-27814457bee4"). InnerVolumeSpecName "kube-api-access-6x287". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.959737 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de350b0e-5712-4f65-b01b-27814457bee4" (UID: "de350b0e-5712-4f65-b01b-27814457bee4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.963948 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-config-data" (OuterVolumeSpecName: "config-data") pod "653f25dd-b7f2-4ec1-8569-96af48c4c388" (UID: "653f25dd-b7f2-4ec1-8569-96af48c4c388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:46 crc kubenswrapper[4675]: I0320 16:22:46.984604 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-config-data" (OuterVolumeSpecName: "config-data") pod "de350b0e-5712-4f65-b01b-27814457bee4" (UID: "de350b0e-5712-4f65-b01b-27814457bee4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001892 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001942 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001955 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001964 4675 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: 
I0320 16:22:47.001973 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x287\" (UniqueName: \"kubernetes.io/projected/de350b0e-5712-4f65-b01b-27814457bee4-kube-api-access-6x287\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001982 4675 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001991 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/653f25dd-b7f2-4ec1-8569-96af48c4c388-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.001999 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.002007 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de350b0e-5712-4f65-b01b-27814457bee4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.002014 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6hz6\" (UniqueName: \"kubernetes.io/projected/653f25dd-b7f2-4ec1-8569-96af48c4c388-kube-api-access-l6hz6\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.028933 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "653f25dd-b7f2-4ec1-8569-96af48c4c388" (UID: "653f25dd-b7f2-4ec1-8569-96af48c4c388"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.103309 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653f25dd-b7f2-4ec1-8569-96af48c4c388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.293625 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9de8c7f-b5e1-4222-ac09-24fa8e26b089","Type":"ContainerStarted","Data":"546bfc4aa6bc3c969b064fd16c9388c7fffb08c84df37c65b4ba469ea614034b"} Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.296196 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7tdnv" event={"ID":"653f25dd-b7f2-4ec1-8569-96af48c4c388","Type":"ContainerDied","Data":"01ae6f1499eed2aec66a74aa556c358bf859fab0ad3474ce6deb717687a6575b"} Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.296228 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01ae6f1499eed2aec66a74aa556c358bf859fab0ad3474ce6deb717687a6575b" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.296284 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7tdnv" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.321376 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zlp5c" event={"ID":"de350b0e-5712-4f65-b01b-27814457bee4","Type":"ContainerDied","Data":"ed2caaf22d7e3f171310e03aafcea00949fc8d36eab19fcb2c3c8d0fee5645d6"} Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.321423 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed2caaf22d7e3f171310e03aafcea00949fc8d36eab19fcb2c3c8d0fee5645d6" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.321491 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zlp5c" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.406918 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-678454f7d4-qg8vh"] Mar 20 16:22:47 crc kubenswrapper[4675]: E0320 16:22:47.407269 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de350b0e-5712-4f65-b01b-27814457bee4" containerName="keystone-bootstrap" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.407285 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="de350b0e-5712-4f65-b01b-27814457bee4" containerName="keystone-bootstrap" Mar 20 16:22:47 crc kubenswrapper[4675]: E0320 16:22:47.407320 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653f25dd-b7f2-4ec1-8569-96af48c4c388" containerName="placement-db-sync" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.407327 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="653f25dd-b7f2-4ec1-8569-96af48c4c388" containerName="placement-db-sync" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.407472 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="de350b0e-5712-4f65-b01b-27814457bee4" containerName="keystone-bootstrap" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.407515 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="653f25dd-b7f2-4ec1-8569-96af48c4c388" containerName="placement-db-sync" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.410605 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.416608 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5zhjr" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.416909 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.416940 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.417806 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.418709 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.420034 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-678454f7d4-qg8vh"] Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.493924 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f7ccf99c9-m6s8x"] Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.495045 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.496624 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.497093 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.497243 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.497323 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.497584 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk87c" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.499066 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.507883 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-internal-tls-certs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.507951 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vdk\" (UniqueName: \"kubernetes.io/projected/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-kube-api-access-v4vdk\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.507981 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-public-tls-certs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.508013 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-config-data\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.508050 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-scripts\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.508089 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-combined-ca-bundle\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.508127 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-logs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.516795 4675 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/keystone-5f7ccf99c9-m6s8x"] Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612046 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-internal-tls-certs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612095 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-fernet-keys\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612128 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvlz\" (UniqueName: \"kubernetes.io/projected/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-kube-api-access-bbvlz\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612159 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4vdk\" (UniqueName: \"kubernetes.io/projected/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-kube-api-access-v4vdk\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612185 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-public-tls-certs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " 
pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612203 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-config-data\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612222 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-combined-ca-bundle\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612236 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-public-tls-certs\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612251 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-config-data\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612290 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-scripts\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 
16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612314 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-internal-tls-certs\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612347 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-combined-ca-bundle\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612377 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-credential-keys\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612402 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-logs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.612427 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-scripts\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 
16:22:47.616663 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-public-tls-certs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.616958 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-internal-tls-certs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.617206 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-logs\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.623181 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-config-data\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.626270 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-scripts\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.634471 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-combined-ca-bundle\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.652415 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vdk\" (UniqueName: \"kubernetes.io/projected/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-kube-api-access-v4vdk\") pod \"placement-678454f7d4-qg8vh\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.713460 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-credential-keys\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.713965 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-scripts\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.714006 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-fernet-keys\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.714032 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvlz\" (UniqueName: \"kubernetes.io/projected/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-kube-api-access-bbvlz\") pod 
\"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.714066 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-config-data\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.714083 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-combined-ca-bundle\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.714102 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-public-tls-certs\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.714148 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-internal-tls-certs\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.725465 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-config-data\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " 
pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.726054 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-combined-ca-bundle\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.726461 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-internal-tls-certs\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.732371 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-fernet-keys\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.733725 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-credential-keys\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.744049 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.746415 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-scripts\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.767337 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-public-tls-certs\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.767470 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvlz\" (UniqueName: \"kubernetes.io/projected/16cf399d-ef4a-4572-a3c3-73e30bb2a54c-kube-api-access-bbvlz\") pod \"keystone-5f7ccf99c9-m6s8x\" (UID: \"16cf399d-ef4a-4572-a3c3-73e30bb2a54c\") " pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:47 crc kubenswrapper[4675]: I0320 16:22:47.843306 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.241562 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.241924 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.315000 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.335616 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.357265 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9de8c7f-b5e1-4222-ac09-24fa8e26b089","Type":"ContainerStarted","Data":"03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd"} Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.357302 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.357442 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.455852 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f7ccf99c9-m6s8x"] Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.510865 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-678454f7d4-qg8vh"] Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.705900 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:48 crc 
kubenswrapper[4675]: I0320 16:22:48.705947 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.822549 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:48 crc kubenswrapper[4675]: I0320 16:22:48.825981 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.256699 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5bf9f9b95c-kgcr5" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.370554 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9de8c7f-b5e1-4222-ac09-24fa8e26b089","Type":"ContainerStarted","Data":"3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630"} Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.399125 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.399102478 podStartE2EDuration="4.399102478s" podCreationTimestamp="2026-03-20 16:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:49.39592019 +0000 UTC m=+1289.429549727" watchObservedRunningTime="2026-03-20 16:22:49.399102478 +0000 UTC m=+1289.432732015" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.537967 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.607804 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-s745j"] Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.608091 4675 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerName="dnsmasq-dns" containerID="cri-o://f61f12a86cdebbbb94dfda6cf319ec5fb54b82b18f966819d172e491c5e172f5" gracePeriod=10 Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.831636 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-cf47f4dbd-zbrxj"] Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.833024 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.833145 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.853306 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cf47f4dbd-zbrxj"] Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.987857 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-internal-tls-certs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.987974 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-combined-ca-bundle\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.988163 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-public-tls-certs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.988281 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfzc\" (UniqueName: \"kubernetes.io/projected/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-kube-api-access-5dfzc\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.988329 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-scripts\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.988356 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-config-data\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:49 crc kubenswrapper[4675]: I0320 16:22:49.988493 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-logs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.086147 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090207 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-public-tls-certs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090276 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfzc\" (UniqueName: \"kubernetes.io/projected/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-kube-api-access-5dfzc\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090302 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-scripts\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090324 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-config-data\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090377 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-logs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" 
Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090397 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-internal-tls-certs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.090431 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-combined-ca-bundle\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.091527 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-logs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.096688 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-internal-tls-certs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.097923 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-scripts\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.098266 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-public-tls-certs\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.098696 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-combined-ca-bundle\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.099596 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-config-data\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.111985 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfzc\" (UniqueName: \"kubernetes.io/projected/faf8d346-4de8-473c-8d9c-a5d1cf895e4e-kube-api-access-5dfzc\") pod \"placement-cf47f4dbd-zbrxj\" (UID: \"faf8d346-4de8-473c-8d9c-a5d1cf895e4e\") " pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.160799 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.384088 4675 generic.go:334] "Generic (PLEG): container finished" podID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerID="f61f12a86cdebbbb94dfda6cf319ec5fb54b82b18f966819d172e491c5e172f5" exitCode=0 Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.384163 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" event={"ID":"a85fb28f-af43-4d9b-acf3-71c4f4494daa","Type":"ContainerDied","Data":"f61f12a86cdebbbb94dfda6cf319ec5fb54b82b18f966819d172e491c5e172f5"} Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.384219 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.384230 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:50 crc kubenswrapper[4675]: I0320 16:22:50.418258 4675 scope.go:117] "RemoveContainer" containerID="9dfba5c9c08b8127e29669df841cfb70228b4a7a8415cb43c4a31498c4a432ec" Mar 20 16:22:51 crc kubenswrapper[4675]: I0320 16:22:51.051660 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:51 crc kubenswrapper[4675]: I0320 16:22:51.060933 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4675]: W0320 16:22:53.812802 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16cf399d_ef4a_4572_a3c3_73e30bb2a54c.slice/crio-7f23bb0b5e306fc231dac8b442f03be77cb57b6223d1a409735263487bcd4959 WatchSource:0}: Error finding container 7f23bb0b5e306fc231dac8b442f03be77cb57b6223d1a409735263487bcd4959: Status 404 returned error can't find the container with id 
7f23bb0b5e306fc231dac8b442f03be77cb57b6223d1a409735263487bcd4959 Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.141571 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.271443 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-swift-storage-0\") pod \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.271575 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-nb\") pod \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.277881 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-config\") pod \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.278031 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-sb\") pod \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.278103 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kb8\" (UniqueName: \"kubernetes.io/projected/a85fb28f-af43-4d9b-acf3-71c4f4494daa-kube-api-access-59kb8\") pod \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\" (UID: 
\"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.278150 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-svc\") pod \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\" (UID: \"a85fb28f-af43-4d9b-acf3-71c4f4494daa\") " Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.299409 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85fb28f-af43-4d9b-acf3-71c4f4494daa-kube-api-access-59kb8" (OuterVolumeSpecName: "kube-api-access-59kb8") pod "a85fb28f-af43-4d9b-acf3-71c4f4494daa" (UID: "a85fb28f-af43-4d9b-acf3-71c4f4494daa"). InnerVolumeSpecName "kube-api-access-59kb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.394721 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kb8\" (UniqueName: \"kubernetes.io/projected/a85fb28f-af43-4d9b-acf3-71c4f4494daa-kube-api-access-59kb8\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.439174 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" event={"ID":"a85fb28f-af43-4d9b-acf3-71c4f4494daa","Type":"ContainerDied","Data":"93ec57987c99e373c7454ed8d20b47fd7e13c16cf3013a096f1e6b5f6d86e1ed"} Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.439241 4675 scope.go:117] "RemoveContainer" containerID="f61f12a86cdebbbb94dfda6cf319ec5fb54b82b18f966819d172e491c5e172f5" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.439393 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-s745j" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.444666 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678454f7d4-qg8vh" event={"ID":"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c","Type":"ContainerStarted","Data":"7d10e5e5940feb7328a72f0d18a49eec7df7eac2a095c1feb308d783b2bb73f1"} Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.449545 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f7ccf99c9-m6s8x" event={"ID":"16cf399d-ef4a-4572-a3c3-73e30bb2a54c","Type":"ContainerStarted","Data":"7f23bb0b5e306fc231dac8b442f03be77cb57b6223d1a409735263487bcd4959"} Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.471091 4675 scope.go:117] "RemoveContainer" containerID="577144b8f6a3076b1c84be34d3bb4ee47f990061b33ae0c33245b8c83f1006ed" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.476267 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cf47f4dbd-zbrxj"] Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.940667 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a85fb28f-af43-4d9b-acf3-71c4f4494daa" (UID: "a85fb28f-af43-4d9b-acf3-71c4f4494daa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.977002 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a85fb28f-af43-4d9b-acf3-71c4f4494daa" (UID: "a85fb28f-af43-4d9b-acf3-71c4f4494daa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.979687 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-config" (OuterVolumeSpecName: "config") pod "a85fb28f-af43-4d9b-acf3-71c4f4494daa" (UID: "a85fb28f-af43-4d9b-acf3-71c4f4494daa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:54 crc kubenswrapper[4675]: I0320 16:22:54.998290 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a85fb28f-af43-4d9b-acf3-71c4f4494daa" (UID: "a85fb28f-af43-4d9b-acf3-71c4f4494daa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.019095 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.019137 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.019150 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.019161 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.038412 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a85fb28f-af43-4d9b-acf3-71c4f4494daa" (UID: "a85fb28f-af43-4d9b-acf3-71c4f4494daa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.123061 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a85fb28f-af43-4d9b-acf3-71c4f4494daa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.193199 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-s745j"] Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.228674 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-s745j"] Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.460221 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678454f7d4-qg8vh" event={"ID":"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c","Type":"ContainerStarted","Data":"6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7"} Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.460625 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678454f7d4-qg8vh" event={"ID":"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c","Type":"ContainerStarted","Data":"36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7"} Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.460692 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.460797 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:22:55 crc 
kubenswrapper[4675]: I0320 16:22:55.461960 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f7ccf99c9-m6s8x" event={"ID":"16cf399d-ef4a-4572-a3c3-73e30bb2a54c","Type":"ContainerStarted","Data":"06777ab309c4e8465b518aeff9fdf2b09061dc0949322d2597718aaf7bba4b9e"} Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.462062 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.463644 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cf47f4dbd-zbrxj" event={"ID":"faf8d346-4de8-473c-8d9c-a5d1cf895e4e","Type":"ContainerStarted","Data":"a75fae1d587d2fcbd5325e767e94740ec3eae45a5e61168902cb6b7624ac466c"} Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.473516 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt28r" event={"ID":"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43","Type":"ContainerStarted","Data":"c4694069148f8729d13a12a9d94b6b5be6dd7d4973f260458e56c3d138e7f773"} Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.500159 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-678454f7d4-qg8vh" podStartSLOduration=8.500127666000001 podStartE2EDuration="8.500127666s" podCreationTimestamp="2026-03-20 16:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:55.490122318 +0000 UTC m=+1295.523751845" watchObservedRunningTime="2026-03-20 16:22:55.500127666 +0000 UTC m=+1295.533757203" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.521599 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerStarted","Data":"6c888e5219681ab88ebaa1c9f5d19285794279a36f3d1138e85af33547fc2ced"} Mar 20 16:22:55 crc kubenswrapper[4675]: 
I0320 16:22:55.533373 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hhfc9" event={"ID":"c1560aa0-d06c-4c98-80bf-0635065cac6f","Type":"ContainerStarted","Data":"b5cc1d310f5936a12d77118aa2706f1466f8b8c6a17cb8c62f755dbd76355a35"} Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.541916 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tt28r" podStartSLOduration=4.014428909 podStartE2EDuration="47.541899028s" podCreationTimestamp="2026-03-20 16:22:08 +0000 UTC" firstStartedPulling="2026-03-20 16:22:10.69762931 +0000 UTC m=+1250.731258847" lastFinishedPulling="2026-03-20 16:22:54.225099429 +0000 UTC m=+1294.258728966" observedRunningTime="2026-03-20 16:22:55.53944335 +0000 UTC m=+1295.573072897" watchObservedRunningTime="2026-03-20 16:22:55.541899028 +0000 UTC m=+1295.575528565" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.542683 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5f7ccf99c9-m6s8x" podStartSLOduration=8.54267321 podStartE2EDuration="8.54267321s" podCreationTimestamp="2026-03-20 16:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:55.516293946 +0000 UTC m=+1295.549923493" watchObservedRunningTime="2026-03-20 16:22:55.54267321 +0000 UTC m=+1295.576302737" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.566801 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hhfc9" podStartSLOduration=2.83976689 podStartE2EDuration="46.566783171s" podCreationTimestamp="2026-03-20 16:22:09 +0000 UTC" firstStartedPulling="2026-03-20 16:22:10.435130156 +0000 UTC m=+1250.468759693" lastFinishedPulling="2026-03-20 16:22:54.162146437 +0000 UTC m=+1294.195775974" observedRunningTime="2026-03-20 16:22:55.561094182 +0000 UTC m=+1295.594723719" 
watchObservedRunningTime="2026-03-20 16:22:55.566783171 +0000 UTC m=+1295.600412708" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.761178 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.763033 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.814508 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 16:22:55 crc kubenswrapper[4675]: I0320 16:22:55.827995 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 16:22:56 crc kubenswrapper[4675]: I0320 16:22:56.574112 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cf47f4dbd-zbrxj" event={"ID":"faf8d346-4de8-473c-8d9c-a5d1cf895e4e","Type":"ContainerStarted","Data":"5b85fa6f6294fc931473ed0a0e9215cf8c2af403e6c9461af32245979021556a"} Mar 20 16:22:56 crc kubenswrapper[4675]: I0320 16:22:56.574196 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cf47f4dbd-zbrxj" event={"ID":"faf8d346-4de8-473c-8d9c-a5d1cf895e4e","Type":"ContainerStarted","Data":"5878628c61beebc2df0c6bb54d5c0314d81513a87a7fef5576c357b98eac5b05"} Mar 20 16:22:56 crc kubenswrapper[4675]: I0320 16:22:56.575231 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 16:22:56 crc kubenswrapper[4675]: I0320 16:22:56.575279 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 16:22:56 crc kubenswrapper[4675]: I0320 16:22:56.685649 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" 
path="/var/lib/kubelet/pods/a85fb28f-af43-4d9b-acf3-71c4f4494daa/volumes" Mar 20 16:22:57 crc kubenswrapper[4675]: I0320 16:22:57.590619 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:57 crc kubenswrapper[4675]: I0320 16:22:57.590900 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.604729 4675 generic.go:334] "Generic (PLEG): container finished" podID="c1560aa0-d06c-4c98-80bf-0635065cac6f" containerID="b5cc1d310f5936a12d77118aa2706f1466f8b8c6a17cb8c62f755dbd76355a35" exitCode=0 Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.607017 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hhfc9" event={"ID":"c1560aa0-d06c-4c98-80bf-0635065cac6f","Type":"ContainerDied","Data":"b5cc1d310f5936a12d77118aa2706f1466f8b8c6a17cb8c62f755dbd76355a35"} Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.631652 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-cf47f4dbd-zbrxj" podStartSLOduration=9.631634777 podStartE2EDuration="9.631634777s" podCreationTimestamp="2026-03-20 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:56.610564032 +0000 UTC m=+1296.644193579" watchObservedRunningTime="2026-03-20 16:22:58.631634777 +0000 UTC m=+1298.665264314" Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.707216 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c454cc68b-lmjfb" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 
16:22:58.747923 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.748091 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.749201 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:22:58 crc kubenswrapper[4675]: I0320 16:22:58.825755 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d79d4db6d-vnw9g" podUID="b81ab73e-24ce-451b-9064-d6ebea2c5976" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 20 16:23:00 crc kubenswrapper[4675]: I0320 16:23:00.629894 4675 generic.go:334] "Generic (PLEG): container finished" podID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" containerID="c4694069148f8729d13a12a9d94b6b5be6dd7d4973f260458e56c3d138e7f773" exitCode=0 Mar 20 16:23:00 crc kubenswrapper[4675]: I0320 16:23:00.629961 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt28r" event={"ID":"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43","Type":"ContainerDied","Data":"c4694069148f8729d13a12a9d94b6b5be6dd7d4973f260458e56c3d138e7f773"} Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.475281 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tt28r" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609429 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-etc-machine-id\") pod \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609526 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkck2\" (UniqueName: \"kubernetes.io/projected/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-kube-api-access-zkck2\") pod \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609569 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" (UID: "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-scripts\") pod \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609728 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-config-data\") pod \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609912 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-db-sync-config-data\") pod \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.609989 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-combined-ca-bundle\") pod \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\" (UID: \"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43\") " Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.610501 4675 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.630544 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-scripts" (OuterVolumeSpecName: "scripts") pod 
"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" (UID: "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.630589 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-kube-api-access-zkck2" (OuterVolumeSpecName: "kube-api-access-zkck2") pod "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" (UID: "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43"). InnerVolumeSpecName "kube-api-access-zkck2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.630971 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" (UID: "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.661733 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt28r" event={"ID":"37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43","Type":"ContainerDied","Data":"7afe2c666ee5b6c44f7514ae19b7f082130de6a94648181d09da106366ddbf16"} Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.661799 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7afe2c666ee5b6c44f7514ae19b7f082130de6a94648181d09da106366ddbf16" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.661862 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tt28r" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.666091 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-config-data" (OuterVolumeSpecName: "config-data") pod "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" (UID: "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.674919 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" (UID: "37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.712410 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.712447 4675 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.712462 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.712474 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkck2\" (UniqueName: \"kubernetes.io/projected/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-kube-api-access-zkck2\") on 
node \"crc\" DevicePath \"\""
Mar 20 16:23:03 crc kubenswrapper[4675]: I0320 16:23:03.712483 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.028435 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hhfc9"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.118185 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-combined-ca-bundle\") pod \"c1560aa0-d06c-4c98-80bf-0635065cac6f\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") "
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.118262 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5927\" (UniqueName: \"kubernetes.io/projected/c1560aa0-d06c-4c98-80bf-0635065cac6f-kube-api-access-h5927\") pod \"c1560aa0-d06c-4c98-80bf-0635065cac6f\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") "
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.118389 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-db-sync-config-data\") pod \"c1560aa0-d06c-4c98-80bf-0635065cac6f\" (UID: \"c1560aa0-d06c-4c98-80bf-0635065cac6f\") "
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.123407 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c1560aa0-d06c-4c98-80bf-0635065cac6f" (UID: "c1560aa0-d06c-4c98-80bf-0635065cac6f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.124171 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1560aa0-d06c-4c98-80bf-0635065cac6f-kube-api-access-h5927" (OuterVolumeSpecName: "kube-api-access-h5927") pod "c1560aa0-d06c-4c98-80bf-0635065cac6f" (UID: "c1560aa0-d06c-4c98-80bf-0635065cac6f"). InnerVolumeSpecName "kube-api-access-h5927". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.143614 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1560aa0-d06c-4c98-80bf-0635065cac6f" (UID: "c1560aa0-d06c-4c98-80bf-0635065cac6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.220094 4675 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.220133 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1560aa0-d06c-4c98-80bf-0635065cac6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.220145 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5927\" (UniqueName: \"kubernetes.io/projected/c1560aa0-d06c-4c98-80bf-0635065cac6f-kube-api-access-h5927\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.674669 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-central-agent" containerID="cri-o://cfae4fcd530731710080b8d955c3d7b3698573155592d864f9bc377d876d1c7f" gracePeriod=30
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.674697 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="proxy-httpd" containerID="cri-o://c811afbc27348c3f822de2016130abdb0bcf03cac3214cca5ba3c0b8a81a6f27" gracePeriod=30
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.674706 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="sg-core" containerID="cri-o://6c888e5219681ab88ebaa1c9f5d19285794279a36f3d1138e85af33547fc2ced" gracePeriod=30
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.674721 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-notification-agent" containerID="cri-o://2ba333a09bb26c779a9007ef8f5f2b0f1a347e974a46a84d5905bff9abf957d7" gracePeriod=30
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.687103 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hhfc9"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.709240 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.077731445 podStartE2EDuration="56.709215401s" podCreationTimestamp="2026-03-20 16:22:08 +0000 UTC" firstStartedPulling="2026-03-20 16:22:10.510665637 +0000 UTC m=+1250.544295174" lastFinishedPulling="2026-03-20 16:23:04.142149593 +0000 UTC m=+1304.175779130" observedRunningTime="2026-03-20 16:23:04.707963176 +0000 UTC m=+1304.741592723" watchObservedRunningTime="2026-03-20 16:23:04.709215401 +0000 UTC m=+1304.742844938"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.827140 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerStarted","Data":"c811afbc27348c3f822de2016130abdb0bcf03cac3214cca5ba3c0b8a81a6f27"}
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.827181 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hhfc9" event={"ID":"c1560aa0-d06c-4c98-80bf-0635065cac6f","Type":"ContainerDied","Data":"9ae74317edd09801769caa5ebd25dafd2ffdb7b049cc42a62d9ad7d4280276ee"}
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.827193 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae74317edd09801769caa5ebd25dafd2ffdb7b049cc42a62d9ad7d4280276ee"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.827208 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.849848 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:23:04 crc kubenswrapper[4675]: E0320 16:23:04.854603 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1560aa0-d06c-4c98-80bf-0635065cac6f" containerName="barbican-db-sync"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.854630 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1560aa0-d06c-4c98-80bf-0635065cac6f" containerName="barbican-db-sync"
Mar 20 16:23:04 crc kubenswrapper[4675]: E0320 16:23:04.854679 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" containerName="cinder-db-sync"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.854686 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" containerName="cinder-db-sync"
Mar 20 16:23:04 crc kubenswrapper[4675]: E0320 16:23:04.854789 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerName="dnsmasq-dns"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.854796 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerName="dnsmasq-dns"
Mar 20 16:23:04 crc kubenswrapper[4675]: E0320 16:23:04.854810 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerName="init"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.854844 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerName="init"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.855556 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85fb28f-af43-4d9b-acf3-71c4f4494daa" containerName="dnsmasq-dns"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.855635 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1560aa0-d06c-4c98-80bf-0635065cac6f" containerName="barbican-db-sync"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.855714 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" containerName="cinder-db-sync"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.876992 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.880012 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xpldv"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.883750 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.883814 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.890244 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.902705 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.932837 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-h52lm"]
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.934619 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.960240 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-h52lm"]
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.981523 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.981619 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjw6\" (UniqueName: \"kubernetes.io/projected/b6d08d76-57ad-48b8-98d5-cc802e2b7194-kube-api-access-ffjw6\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.981662 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.981696 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.981794 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.981850 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6d08d76-57ad-48b8-98d5-cc802e2b7194-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.983075 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.984678 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.988247 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 20 16:23:04 crc kubenswrapper[4675]: I0320 16:23:04.993457 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.083803 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjw6\" (UniqueName: \"kubernetes.io/projected/b6d08d76-57ad-48b8-98d5-cc802e2b7194-kube-api-access-ffjw6\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.083860 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.083894 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-scripts\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.083914 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.083937 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99f9365b-39a0-4a6f-878c-15de1753bbbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.083967 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084003 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5tnz\" (UniqueName: \"kubernetes.io/projected/99f9365b-39a0-4a6f-878c-15de1753bbbb-kube-api-access-b5tnz\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084029 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f9365b-39a0-4a6f-878c-15de1753bbbb-logs\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084044 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-config\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084060 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084074 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk8n5\" (UniqueName: \"kubernetes.io/projected/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-kube-api-access-jk8n5\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084111 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084150 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6d08d76-57ad-48b8-98d5-cc802e2b7194-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084173 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084198 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084231 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084252 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084265 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084282 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.084951 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6d08d76-57ad-48b8-98d5-cc802e2b7194-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.106972 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.108496 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.108527 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.110901 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.142504 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjw6\" (UniqueName: \"kubernetes.io/projected/b6d08d76-57ad-48b8-98d5-cc802e2b7194-kube-api-access-ffjw6\") pod \"cinder-scheduler-0\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186301 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186375 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5tnz\" (UniqueName: \"kubernetes.io/projected/99f9365b-39a0-4a6f-878c-15de1753bbbb-kube-api-access-b5tnz\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186598 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f9365b-39a0-4a6f-878c-15de1753bbbb-logs\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186623 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-config\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186647 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk8n5\" (UniqueName: \"kubernetes.io/projected/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-kube-api-access-jk8n5\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186676 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186723 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186752 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186813 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186834 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186860 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186936 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-scripts\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.186969 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99f9365b-39a0-4a6f-878c-15de1753bbbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.187064 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99f9365b-39a0-4a6f-878c-15de1753bbbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.187201 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f9365b-39a0-4a6f-878c-15de1753bbbb-logs\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.187554 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.187685 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-config\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.187854 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.188223 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.188231 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.195086 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.200254 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.200460 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-scripts\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.201609 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.207541 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5tnz\" (UniqueName: \"kubernetes.io/projected/99f9365b-39a0-4a6f-878c-15de1753bbbb-kube-api-access-b5tnz\") pod \"cinder-api-0\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.210925 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk8n5\" (UniqueName: \"kubernetes.io/projected/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-kube-api-access-jk8n5\") pod \"dnsmasq-dns-d68b9cb4c-h52lm\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") " pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.223205 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.265119 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.344258 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.367593 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-cc6bc59f9-6rgrv"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.369468 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-cc6bc59f9-6rgrv"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.373737 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-scdtr"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.373926 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.376620 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.417928 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64fc78bb94-nfn76"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.419371 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.423847 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.467537 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cc6bc59f9-6rgrv"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495359 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8twp\" (UniqueName: \"kubernetes.io/projected/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-kube-api-access-n8twp\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495455 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-config-data-custom\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495508 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca93e3f6-95e2-4973-8100-94e89ad3515b-logs\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495527 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-combined-ca-bundle\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495552 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-config-data-custom\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495592 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-config-data\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495635 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-logs\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.495659 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-combined-ca-bundle\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.507211 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzmhr\" (UniqueName: \"kubernetes.io/projected/ca93e3f6-95e2-4973-8100-94e89ad3515b-kube-api-access-jzmhr\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.507968 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-config-data\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76"
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.523544 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64fc78bb94-nfn76"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.546194 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-h52lm"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.605887 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2dwd4"]
Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.610558 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611369 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca93e3f6-95e2-4973-8100-94e89ad3515b-logs\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611407 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-combined-ca-bundle\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611433 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-config-data-custom\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611485 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-config-data\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611531 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-logs\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: 
\"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611547 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-combined-ca-bundle\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611577 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzmhr\" (UniqueName: \"kubernetes.io/projected/ca93e3f6-95e2-4973-8100-94e89ad3515b-kube-api-access-jzmhr\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611633 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-config-data\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611669 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8twp\" (UniqueName: \"kubernetes.io/projected/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-kube-api-access-n8twp\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.611702 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-config-data-custom\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.613856 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca93e3f6-95e2-4973-8100-94e89ad3515b-logs\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.614216 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-logs\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.624251 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2dwd4"] Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.638901 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7568b66c46-pphb7"] Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.640673 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.648431 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-combined-ca-bundle\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.649960 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7568b66c46-pphb7"] Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.651054 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.655914 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-config-data\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.656294 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-config-data-custom\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.662396 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-config-data-custom\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " 
pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.663324 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzmhr\" (UniqueName: \"kubernetes.io/projected/ca93e3f6-95e2-4973-8100-94e89ad3515b-kube-api-access-jzmhr\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.663969 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca93e3f6-95e2-4973-8100-94e89ad3515b-config-data\") pod \"barbican-keystone-listener-64fc78bb94-nfn76\" (UID: \"ca93e3f6-95e2-4973-8100-94e89ad3515b\") " pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.664946 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8twp\" (UniqueName: \"kubernetes.io/projected/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-kube-api-access-n8twp\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.669355 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9-combined-ca-bundle\") pod \"barbican-worker-cc6bc59f9-6rgrv\" (UID: \"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9\") " pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713474 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: 
\"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713591 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvdth\" (UniqueName: \"kubernetes.io/projected/a5ae6534-550e-4564-8ade-613dfbe1fa32-kube-api-access-lvdth\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713654 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-config\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713706 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data-custom\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713745 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-svc\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713779 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data\") pod 
\"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713815 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-combined-ca-bundle\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713859 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713887 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713915 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qv9m\" (UniqueName: \"kubernetes.io/projected/9da42c2c-0330-4ba7-9b59-46d4d84fe045-kube-api-access-6qv9m\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.713948 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a5ae6534-550e-4564-8ade-613dfbe1fa32-logs\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.759840 4675 generic.go:334] "Generic (PLEG): container finished" podID="7497e477-249c-4346-9087-458ba9e6c152" containerID="c811afbc27348c3f822de2016130abdb0bcf03cac3214cca5ba3c0b8a81a6f27" exitCode=0 Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.759883 4675 generic.go:334] "Generic (PLEG): container finished" podID="7497e477-249c-4346-9087-458ba9e6c152" containerID="6c888e5219681ab88ebaa1c9f5d19285794279a36f3d1138e85af33547fc2ced" exitCode=2 Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.759896 4675 generic.go:334] "Generic (PLEG): container finished" podID="7497e477-249c-4346-9087-458ba9e6c152" containerID="cfae4fcd530731710080b8d955c3d7b3698573155592d864f9bc377d876d1c7f" exitCode=0 Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.759921 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerDied","Data":"c811afbc27348c3f822de2016130abdb0bcf03cac3214cca5ba3c0b8a81a6f27"} Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.759952 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerDied","Data":"6c888e5219681ab88ebaa1c9f5d19285794279a36f3d1138e85af33547fc2ced"} Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.759968 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerDied","Data":"cfae4fcd530731710080b8d955c3d7b3698573155592d864f9bc377d876d1c7f"} Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.777240 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-cc6bc59f9-6rgrv" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.798661 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.816590 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.816941 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.816986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qv9m\" (UniqueName: \"kubernetes.io/projected/9da42c2c-0330-4ba7-9b59-46d4d84fe045-kube-api-access-6qv9m\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817030 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ae6534-550e-4564-8ade-613dfbe1fa32-logs\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817099 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817187 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvdth\" (UniqueName: \"kubernetes.io/projected/a5ae6534-550e-4564-8ade-613dfbe1fa32-kube-api-access-lvdth\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817281 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-config\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817351 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data-custom\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817401 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-svc\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817420 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.817479 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-combined-ca-bundle\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.823642 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.823681 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-svc\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.824251 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.824783 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-config\") pod 
\"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.825259 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.825554 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ae6534-550e-4564-8ade-613dfbe1fa32-logs\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.828279 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data-custom\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.835166 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-combined-ca-bundle\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.835233 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " 
pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.842496 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qv9m\" (UniqueName: \"kubernetes.io/projected/9da42c2c-0330-4ba7-9b59-46d4d84fe045-kube-api-access-6qv9m\") pod \"dnsmasq-dns-5784cf869f-2dwd4\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.857445 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvdth\" (UniqueName: \"kubernetes.io/projected/a5ae6534-550e-4564-8ade-613dfbe1fa32-kube-api-access-lvdth\") pod \"barbican-api-7568b66c46-pphb7\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:05 crc kubenswrapper[4675]: I0320 16:23:05.886879 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.092522 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:06 crc kubenswrapper[4675]: W0320 16:23:06.106272 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d08d76_57ad_48b8_98d5_cc802e2b7194.slice/crio-5d09e7fafedf66bb977fd06abd5dea02e3e4afb100090b50b6f5c8c5d511f56f WatchSource:0}: Error finding container 5d09e7fafedf66bb977fd06abd5dea02e3e4afb100090b50b6f5c8c5d511f56f: Status 404 returned error can't find the container with id 5d09e7fafedf66bb977fd06abd5dea02e3e4afb100090b50b6f5c8c5d511f56f Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.107199 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4"
Mar 20 16:23:06 crc kubenswrapper[4675]: W0320 16:23:06.311044 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35fb26f0_e62c_4512_b4ea_6783b4a61c4e.slice/crio-95347f2c634ffed3b4bd442f5f6d813f4b8a51dcaffc1b774c00e7c6fe3a6ca3 WatchSource:0}: Error finding container 95347f2c634ffed3b4bd442f5f6d813f4b8a51dcaffc1b774c00e7c6fe3a6ca3: Status 404 returned error can't find the container with id 95347f2c634ffed3b4bd442f5f6d813f4b8a51dcaffc1b774c00e7c6fe3a6ca3
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.312270 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-h52lm"]
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.414806 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:23:06 crc kubenswrapper[4675]: W0320 16:23:06.427900 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f9365b_39a0_4a6f_878c_15de1753bbbb.slice/crio-5d3cd052d20fcad0b588d46de93a5e06751d79c4e204738287742ac14991b456 WatchSource:0}: Error finding container 5d3cd052d20fcad0b588d46de93a5e06751d79c4e204738287742ac14991b456: Status 404 returned error can't find the container with id 5d3cd052d20fcad0b588d46de93a5e06751d79c4e204738287742ac14991b456
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.544059 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cc6bc59f9-6rgrv"]
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.585729 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64fc78bb94-nfn76"]
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.688868 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7568b66c46-pphb7"]
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.757678 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2dwd4"]
Mar 20 16:23:06 crc kubenswrapper[4675]: W0320 16:23:06.768667 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9da42c2c_0330_4ba7_9b59_46d4d84fe045.slice/crio-8cb1cc99aff2e17049118026ad299a1f50e37b657fbc0d3787b459c0ffeb89b2 WatchSource:0}: Error finding container 8cb1cc99aff2e17049118026ad299a1f50e37b657fbc0d3787b459c0ffeb89b2: Status 404 returned error can't find the container with id 8cb1cc99aff2e17049118026ad299a1f50e37b657fbc0d3787b459c0ffeb89b2
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.775640 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6d08d76-57ad-48b8-98d5-cc802e2b7194","Type":"ContainerStarted","Data":"5d09e7fafedf66bb977fd06abd5dea02e3e4afb100090b50b6f5c8c5d511f56f"}
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.777244 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" event={"ID":"ca93e3f6-95e2-4973-8100-94e89ad3515b","Type":"ContainerStarted","Data":"401f55e37ee5ef06e2f0078c9299caf510de2acbc4ca0f4158fcdf16f421aec1"}
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.779398 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm" event={"ID":"35fb26f0-e62c-4512-b4ea-6783b4a61c4e","Type":"ContainerStarted","Data":"cf7887f1d141b695f121e7d572b7bf756afcb691fd7f4c8fa0c4708f5b4b512b"}
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.779432 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm" event={"ID":"35fb26f0-e62c-4512-b4ea-6783b4a61c4e","Type":"ContainerStarted","Data":"95347f2c634ffed3b4bd442f5f6d813f4b8a51dcaffc1b774c00e7c6fe3a6ca3"}
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.779466 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm" podUID="35fb26f0-e62c-4512-b4ea-6783b4a61c4e" containerName="init" containerID="cri-o://cf7887f1d141b695f121e7d572b7bf756afcb691fd7f4c8fa0c4708f5b4b512b" gracePeriod=10
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.781427 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc6bc59f9-6rgrv" event={"ID":"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9","Type":"ContainerStarted","Data":"01676edbf51656b2b6e0acf445e7fb950de9119fe0c972cdf267ffbb51942a08"}
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.783645 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7568b66c46-pphb7" event={"ID":"a5ae6534-550e-4564-8ade-613dfbe1fa32","Type":"ContainerStarted","Data":"8611cb742b9eed1767b4244f69ef7878c73f56022b003827c72f43f2cba81a52"}
Mar 20 16:23:06 crc kubenswrapper[4675]: I0320 16:23:06.785353 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"99f9365b-39a0-4a6f-878c-15de1753bbbb","Type":"ContainerStarted","Data":"5d3cd052d20fcad0b588d46de93a5e06751d79c4e204738287742ac14991b456"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.198508 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.818126 4675 generic.go:334] "Generic (PLEG): container finished" podID="35fb26f0-e62c-4512-b4ea-6783b4a61c4e" containerID="cf7887f1d141b695f121e7d572b7bf756afcb691fd7f4c8fa0c4708f5b4b512b" exitCode=0
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.818233 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm" event={"ID":"35fb26f0-e62c-4512-b4ea-6783b4a61c4e","Type":"ContainerDied","Data":"cf7887f1d141b695f121e7d572b7bf756afcb691fd7f4c8fa0c4708f5b4b512b"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.818688 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm" event={"ID":"35fb26f0-e62c-4512-b4ea-6783b4a61c4e","Type":"ContainerDied","Data":"95347f2c634ffed3b4bd442f5f6d813f4b8a51dcaffc1b774c00e7c6fe3a6ca3"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.818713 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95347f2c634ffed3b4bd442f5f6d813f4b8a51dcaffc1b774c00e7c6fe3a6ca3"
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.822521 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7568b66c46-pphb7" event={"ID":"a5ae6534-550e-4564-8ade-613dfbe1fa32","Type":"ContainerStarted","Data":"05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.822563 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7568b66c46-pphb7" event={"ID":"a5ae6534-550e-4564-8ade-613dfbe1fa32","Type":"ContainerStarted","Data":"e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.822804 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7568b66c46-pphb7"
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.824995 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"99f9365b-39a0-4a6f-878c-15de1753bbbb","Type":"ContainerStarted","Data":"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.830620 4675 generic.go:334] "Generic (PLEG): container finished" podID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerID="ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc" exitCode=0
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.830668 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" event={"ID":"9da42c2c-0330-4ba7-9b59-46d4d84fe045","Type":"ContainerDied","Data":"ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.830697 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" event={"ID":"9da42c2c-0330-4ba7-9b59-46d4d84fe045","Type":"ContainerStarted","Data":"8cb1cc99aff2e17049118026ad299a1f50e37b657fbc0d3787b459c0ffeb89b2"}
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.845853 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7568b66c46-pphb7" podStartSLOduration=2.845829895 podStartE2EDuration="2.845829895s" podCreationTimestamp="2026-03-20 16:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:07.837211235 +0000 UTC m=+1307.870840772" watchObservedRunningTime="2026-03-20 16:23:07.845829895 +0000 UTC m=+1307.879459452"
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.862060 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.964752 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-sb\") pod \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") "
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.964923 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-config\") pod \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") "
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.965348 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk8n5\" (UniqueName: \"kubernetes.io/projected/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-kube-api-access-jk8n5\") pod \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") "
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.965387 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-swift-storage-0\") pod \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") "
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.965517 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-svc\") pod \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") "
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.965600 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-nb\") pod \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\" (UID: \"35fb26f0-e62c-4512-b4ea-6783b4a61c4e\") "
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.968508 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-kube-api-access-jk8n5" (OuterVolumeSpecName: "kube-api-access-jk8n5") pod "35fb26f0-e62c-4512-b4ea-6783b4a61c4e" (UID: "35fb26f0-e62c-4512-b4ea-6783b4a61c4e"). InnerVolumeSpecName "kube-api-access-jk8n5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:23:07 crc kubenswrapper[4675]: I0320 16:23:07.992294 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35fb26f0-e62c-4512-b4ea-6783b4a61c4e" (UID: "35fb26f0-e62c-4512-b4ea-6783b4a61c4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:07.998978 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "35fb26f0-e62c-4512-b4ea-6783b4a61c4e" (UID: "35fb26f0-e62c-4512-b4ea-6783b4a61c4e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.000655 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35fb26f0-e62c-4512-b4ea-6783b4a61c4e" (UID: "35fb26f0-e62c-4512-b4ea-6783b4a61c4e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.005852 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35fb26f0-e62c-4512-b4ea-6783b4a61c4e" (UID: "35fb26f0-e62c-4512-b4ea-6783b4a61c4e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.022878 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-config" (OuterVolumeSpecName: "config") pod "35fb26f0-e62c-4512-b4ea-6783b4a61c4e" (UID: "35fb26f0-e62c-4512-b4ea-6783b4a61c4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.068114 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.068148 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.068159 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.068168 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.068177 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk8n5\" (UniqueName: \"kubernetes.io/projected/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-kube-api-access-jk8n5\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.068187 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35fb26f0-e62c-4512-b4ea-6783b4a61c4e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.852406 4675 generic.go:334] "Generic (PLEG): container finished" podID="7497e477-249c-4346-9087-458ba9e6c152" containerID="2ba333a09bb26c779a9007ef8f5f2b0f1a347e974a46a84d5905bff9abf957d7" exitCode=0
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.852731 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerDied","Data":"2ba333a09bb26c779a9007ef8f5f2b0f1a347e974a46a84d5905bff9abf957d7"}
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.855456 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"99f9365b-39a0-4a6f-878c-15de1753bbbb","Type":"ContainerStarted","Data":"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f"}
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.855537 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-h52lm"
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.855614 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api-log" containerID="cri-o://4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae" gracePeriod=30
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.855693 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api" containerID="cri-o://b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f" gracePeriod=30
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.856241 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7568b66c46-pphb7"
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.856270 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.889777 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.889741591 podStartE2EDuration="4.889741591s" podCreationTimestamp="2026-03-20 16:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:08.881518542 +0000 UTC m=+1308.915148079" watchObservedRunningTime="2026-03-20 16:23:08.889741591 +0000 UTC m=+1308.923371128"
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.935111 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-h52lm"]
Mar 20 16:23:08 crc kubenswrapper[4675]: I0320 16:23:08.968203 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-h52lm"]
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.338356 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.399854 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gltg\" (UniqueName: \"kubernetes.io/projected/7497e477-249c-4346-9087-458ba9e6c152-kube-api-access-8gltg\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.400203 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-scripts\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.400252 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-config-data\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.400302 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-run-httpd\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.400376 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-log-httpd\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.400406 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-combined-ca-bundle\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.400468 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-sg-core-conf-yaml\") pod \"7497e477-249c-4346-9087-458ba9e6c152\" (UID: \"7497e477-249c-4346-9087-458ba9e6c152\") "
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.408917 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.409281 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.422134 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7497e477-249c-4346-9087-458ba9e6c152-kube-api-access-8gltg" (OuterVolumeSpecName: "kube-api-access-8gltg") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "kube-api-access-8gltg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.424004 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-scripts" (OuterVolumeSpecName: "scripts") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.458974 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.502359 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gltg\" (UniqueName: \"kubernetes.io/projected/7497e477-249c-4346-9087-458ba9e6c152-kube-api-access-8gltg\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.502384 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.502393 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.502402 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7497e477-249c-4346-9087-458ba9e6c152-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.502410 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.523558 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.548983 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-config-data" (OuterVolumeSpecName: "config-data") pod "7497e477-249c-4346-9087-458ba9e6c152" (UID: "7497e477-249c-4346-9087-458ba9e6c152"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.581816 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-865b456f44-shc9z"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.610652 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.610681 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7497e477-249c-4346-9087-458ba9e6c152-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.841641 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b6db44d49-fvfv9"]
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.842450 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b6db44d49-fvfv9" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-api" containerID="cri-o://e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5" gracePeriod=30
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.842702 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b6db44d49-fvfv9" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-httpd" containerID="cri-o://43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1" gracePeriod=30
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.880621 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f9ffbdb49-tcvn8"]
Mar 20 16:23:09 crc kubenswrapper[4675]: E0320 16:23:09.882485 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-central-agent"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.882650 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-central-agent"
Mar 20 16:23:09 crc kubenswrapper[4675]: E0320 16:23:09.882741 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fb26f0-e62c-4512-b4ea-6783b4a61c4e" containerName="init"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.882834 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fb26f0-e62c-4512-b4ea-6783b4a61c4e" containerName="init"
Mar 20 16:23:09 crc kubenswrapper[4675]: E0320 16:23:09.882979 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-notification-agent"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.883088 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-notification-agent"
Mar 20 16:23:09 crc kubenswrapper[4675]: E0320 16:23:09.883180 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="proxy-httpd"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.883380 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="proxy-httpd"
Mar 20 16:23:09 crc kubenswrapper[4675]: E0320 16:23:09.883483 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="sg-core"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.883591 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="sg-core"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.883933 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-central-agent"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.885344 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="ceilometer-notification-agent"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.885458 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="sg-core"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.885572 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7497e477-249c-4346-9087-458ba9e6c152" containerName="proxy-httpd"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.885653 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fb26f0-e62c-4512-b4ea-6783b4a61c4e" containerName="init"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.889598 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.903047 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-b6db44d49-fvfv9" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": EOF"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.908870 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f9ffbdb49-tcvn8"]
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916032 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-httpd-config\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916098 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-public-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916147 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrbk4\" (UniqueName: \"kubernetes.io/projected/c411bc97-1398-438f-a194-1d53f896f405-kube-api-access-hrbk4\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916173 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-internal-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916189 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-config\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916256 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-combined-ca-bundle\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.916274 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-ovndb-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.921372 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" event={"ID":"9da42c2c-0330-4ba7-9b59-46d4d84fe045","Type":"ContainerStarted","Data":"0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b"}
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.922646 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.959756 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" podStartSLOduration=4.959742074 podStartE2EDuration="4.959742074s" podCreationTimestamp="2026-03-20 16:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:09.956985277 +0000 UTC m=+1309.990614814" watchObservedRunningTime="2026-03-20 16:23:09.959742074 +0000 UTC m=+1309.993371611"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.964361 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.964622 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" event={"ID":"ca93e3f6-95e2-4973-8100-94e89ad3515b","Type":"ContainerStarted","Data":"445b61f9c5146186660a7f88ac96dff4429e8b3264dfe7fd202101ab325b798b"}
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.971708 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc6bc59f9-6rgrv" event={"ID":"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9","Type":"ContainerStarted","Data":"dbb2bf0916e12a08be61498a126eada90ece1a8b91db9f75403fffec3cf96c0a"}
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.978915 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.979719 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7497e477-249c-4346-9087-458ba9e6c152","Type":"ContainerDied","Data":"eda9251422a7a196a30fbe6be7185712c179311a552f52b6c1a0f12fdbe7ea58"}
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.979784 4675 scope.go:117] "RemoveContainer" containerID="c811afbc27348c3f822de2016130abdb0bcf03cac3214cca5ba3c0b8a81a6f27"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.997118 4675 generic.go:334] "Generic (PLEG): container finished" podID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerID="b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f" exitCode=0
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.997160 4675 generic.go:334] "Generic (PLEG): container finished" podID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerID="4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae" exitCode=143
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.998112 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.998278 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"99f9365b-39a0-4a6f-878c-15de1753bbbb","Type":"ContainerDied","Data":"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f"}
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.998315 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"99f9365b-39a0-4a6f-878c-15de1753bbbb","Type":"ContainerDied","Data":"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae"}
Mar 20 16:23:09 crc kubenswrapper[4675]: I0320 16:23:09.998327 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"99f9365b-39a0-4a6f-878c-15de1753bbbb","Type":"ContainerDied","Data":"5d3cd052d20fcad0b588d46de93a5e06751d79c4e204738287742ac14991b456"}
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.032953 4675 scope.go:117] "RemoveContainer" containerID="6c888e5219681ab88ebaa1c9f5d19285794279a36f3d1138e85af33547fc2ced"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.044354 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-combined-ca-bundle\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.044504 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-ovndb-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.044914 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-httpd-config\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.045341 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-public-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.045496 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrbk4\" (UniqueName: \"kubernetes.io/projected/c411bc97-1398-438f-a194-1d53f896f405-kube-api-access-hrbk4\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.045558 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-internal-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.045583 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-config\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.051375 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-combined-ca-bundle\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.051432 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-httpd-config\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.066700 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-public-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.068234 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-internal-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.086126 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-ovndb-tls-certs\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.091017 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-cc6bc59f9-6rgrv" podStartSLOduration=2.754975597 podStartE2EDuration="5.090992356s" podCreationTimestamp="2026-03-20 16:23:05 +0000 UTC" 
firstStartedPulling="2026-03-20 16:23:06.597948023 +0000 UTC m=+1306.631577560" lastFinishedPulling="2026-03-20 16:23:08.933964782 +0000 UTC m=+1308.967594319" observedRunningTime="2026-03-20 16:23:10.046356784 +0000 UTC m=+1310.079986331" watchObservedRunningTime="2026-03-20 16:23:10.090992356 +0000 UTC m=+1310.124621893" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.095391 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrbk4\" (UniqueName: \"kubernetes.io/projected/c411bc97-1398-438f-a194-1d53f896f405-kube-api-access-hrbk4\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.110254 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c411bc97-1398-438f-a194-1d53f896f405-config\") pod \"neutron-6f9ffbdb49-tcvn8\" (UID: \"c411bc97-1398-438f-a194-1d53f896f405\") " pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.124728 4675 scope.go:117] "RemoveContainer" containerID="2ba333a09bb26c779a9007ef8f5f2b0f1a347e974a46a84d5905bff9abf957d7" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147033 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data-custom\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147089 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-scripts\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 
16:23:10.147111 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99f9365b-39a0-4a6f-878c-15de1753bbbb-etc-machine-id\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147210 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f9365b-39a0-4a6f-878c-15de1753bbbb-logs\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147271 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-combined-ca-bundle\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147325 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5tnz\" (UniqueName: \"kubernetes.io/projected/99f9365b-39a0-4a6f-878c-15de1753bbbb-kube-api-access-b5tnz\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147432 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data\") pod \"99f9365b-39a0-4a6f-878c-15de1753bbbb\" (UID: \"99f9365b-39a0-4a6f-878c-15de1753bbbb\") " Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.147930 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99f9365b-39a0-4a6f-878c-15de1753bbbb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.148182 4675 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99f9365b-39a0-4a6f-878c-15de1753bbbb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.148909 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f9365b-39a0-4a6f-878c-15de1753bbbb-logs" (OuterVolumeSpecName: "logs") pod "99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.153375 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-scripts" (OuterVolumeSpecName: "scripts") pod "99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.154890 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.162288 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.178354 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f9365b-39a0-4a6f-878c-15de1753bbbb-kube-api-access-b5tnz" (OuterVolumeSpecName: "kube-api-access-b5tnz") pod "99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "kube-api-access-b5tnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.183918 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.200643 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: E0320 16:23:10.201241 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.201283 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api" Mar 20 16:23:10 crc kubenswrapper[4675]: E0320 16:23:10.201300 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api-log" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.201309 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api-log" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.201610 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.201635 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" containerName="cinder-api-log" Mar 20 
16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.203009 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.204390 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.208337 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.208879 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.211045 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.215988 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data" (OuterVolumeSpecName: "config-data") pod "99f9365b-39a0-4a6f-878c-15de1753bbbb" (UID: "99f9365b-39a0-4a6f-878c-15de1753bbbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.233482 4675 scope.go:117] "RemoveContainer" containerID="cfae4fcd530731710080b8d955c3d7b3698573155592d864f9bc377d876d1c7f" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250033 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250292 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-scripts\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250365 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250459 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtmzw\" (UniqueName: \"kubernetes.io/projected/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-kube-api-access-mtmzw\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250620 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-config-data\") pod 
\"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250681 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250833 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250912 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f9365b-39a0-4a6f-878c-15de1753bbbb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250930 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250943 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5tnz\" (UniqueName: \"kubernetes.io/projected/99f9365b-39a0-4a6f-878c-15de1753bbbb-kube-api-access-b5tnz\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250955 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250967 4675 reconciler_common.go:293] 
"Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.250978 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f9365b-39a0-4a6f-878c-15de1753bbbb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.261083 4675 scope.go:117] "RemoveContainer" containerID="b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.283975 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.289959 4675 scope.go:117] "RemoveContainer" containerID="4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353421 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-config-data\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353505 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353583 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " 
pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353645 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353669 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-scripts\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353694 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.353741 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtmzw\" (UniqueName: \"kubernetes.io/projected/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-kube-api-access-mtmzw\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.356496 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.357451 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.357666 
4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.360538 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.360727 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-scripts\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.361331 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.393036 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-config-data\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.406439 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtmzw\" (UniqueName: \"kubernetes.io/projected/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-kube-api-access-mtmzw\") pod \"ceilometer-0\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " 
pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.423036 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.428564 4675 scope.go:117] "RemoveContainer" containerID="b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.433422 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: E0320 16:23:10.435158 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f\": container with ID starting with b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f not found: ID does not exist" containerID="b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.435311 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f"} err="failed to get container status \"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f\": rpc error: code = NotFound desc = could not find container \"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f\": container with ID starting with b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f not found: ID does not exist" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.435482 4675 scope.go:117] "RemoveContainer" containerID="4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae" Mar 20 16:23:10 crc kubenswrapper[4675]: E0320 16:23:10.435790 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae\": 
container with ID starting with 4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae not found: ID does not exist" containerID="4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.435913 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae"} err="failed to get container status \"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae\": rpc error: code = NotFound desc = could not find container \"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae\": container with ID starting with 4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae not found: ID does not exist" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.436004 4675 scope.go:117] "RemoveContainer" containerID="b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.436324 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.438039 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f"} err="failed to get container status \"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f\": rpc error: code = NotFound desc = could not find container \"b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f\": container with ID starting with b53c2e86ac217d7c0e19e45896fe72bff136b92ff968a2989971b40831a9ea6f not found: ID does not exist" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.438080 4675 scope.go:117] "RemoveContainer" containerID="4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.439592 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae"} err="failed to get container status \"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae\": rpc error: code = NotFound desc = could not find container \"4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae\": container with ID starting with 4d9b8d600aae6ca9ed7647e9ccf692765498616f1842d0336ca49d99c14328ae not found: ID does not exist" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.439967 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.440250 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.440318 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.452510 4675 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.532596 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.561924 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab423d6-a026-41a9-8ae7-7ab2339de5ef-logs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562388 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562459 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-config-data-custom\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562489 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562645 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-scripts\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562818 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqqkc\" (UniqueName: \"kubernetes.io/projected/cab423d6-a026-41a9-8ae7-7ab2339de5ef-kube-api-access-bqqkc\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562847 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562908 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-config-data\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.562966 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cab423d6-a026-41a9-8ae7-7ab2339de5ef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.665336 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab423d6-a026-41a9-8ae7-7ab2339de5ef-logs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " 
pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.665403 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.666507 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-config-data-custom\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.666532 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.666569 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-scripts\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.666621 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqqkc\" (UniqueName: \"kubernetes.io/projected/cab423d6-a026-41a9-8ae7-7ab2339de5ef-kube-api-access-bqqkc\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.666646 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.666673 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-config-data\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.667117 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cab423d6-a026-41a9-8ae7-7ab2339de5ef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.667261 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cab423d6-a026-41a9-8ae7-7ab2339de5ef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.665891 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab423d6-a026-41a9-8ae7-7ab2339de5ef-logs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.678335 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-scripts\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.681503 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.681517 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.681937 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.682372 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-config-data-custom\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.685321 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab423d6-a026-41a9-8ae7-7ab2339de5ef-config-data\") pod \"cinder-api-0\" (UID: \"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.709910 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqqkc\" (UniqueName: \"kubernetes.io/projected/cab423d6-a026-41a9-8ae7-7ab2339de5ef-kube-api-access-bqqkc\") pod \"cinder-api-0\" (UID: 
\"cab423d6-a026-41a9-8ae7-7ab2339de5ef\") " pod="openstack/cinder-api-0" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.776715 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fb26f0-e62c-4512-b4ea-6783b4a61c4e" path="/var/lib/kubelet/pods/35fb26f0-e62c-4512-b4ea-6783b4a61c4e/volumes" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.777970 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7497e477-249c-4346-9087-458ba9e6c152" path="/var/lib/kubelet/pods/7497e477-249c-4346-9087-458ba9e6c152/volumes" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.778859 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f9365b-39a0-4a6f-878c-15de1753bbbb" path="/var/lib/kubelet/pods/99f9365b-39a0-4a6f-878c-15de1753bbbb/volumes" Mar 20 16:23:10 crc kubenswrapper[4675]: I0320 16:23:10.827243 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.019526 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" event={"ID":"ca93e3f6-95e2-4973-8100-94e89ad3515b","Type":"ContainerStarted","Data":"800e2a69720e4e82f05ab62c33358a581e5ff17d5113875f46e377711f5a8837"} Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.040084 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cc6bc59f9-6rgrv" event={"ID":"780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9","Type":"ContainerStarted","Data":"b3df1867701a82f148c232bbc489ca6e21a7c5bc584e6281a47eb02855853b6c"} Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.060519 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64fc78bb94-nfn76" podStartSLOduration=3.702536233 podStartE2EDuration="6.060503671s" podCreationTimestamp="2026-03-20 16:23:05 +0000 UTC" firstStartedPulling="2026-03-20 
16:23:06.575625353 +0000 UTC m=+1306.609254890" lastFinishedPulling="2026-03-20 16:23:08.933592781 +0000 UTC m=+1308.967222328" observedRunningTime="2026-03-20 16:23:11.054134543 +0000 UTC m=+1311.087764090" watchObservedRunningTime="2026-03-20 16:23:11.060503671 +0000 UTC m=+1311.094133208" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.061927 4675 generic.go:334] "Generic (PLEG): container finished" podID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerID="91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817" exitCode=137 Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.061974 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf9f9b95c-kgcr5" event={"ID":"d94a2d76-92e2-4403-ad6d-e2124b400d78","Type":"ContainerDied","Data":"91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817"} Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.086576 4675 generic.go:334] "Generic (PLEG): container finished" podID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerID="43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1" exitCode=0 Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.086675 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6db44d49-fvfv9" event={"ID":"e7c2420c-dca0-41c9-8f28-79a5337c7444","Type":"ContainerDied","Data":"43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1"} Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.104221 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6d08d76-57ad-48b8-98d5-cc802e2b7194","Type":"ContainerStarted","Data":"df748cc02568643b3a9cce8a55dbb5a468996406070e638b5283f7c5472bd336"} Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.238614 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.271605 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-6f9ffbdb49-tcvn8"] Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.358727 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:23:11 crc kubenswrapper[4675]: W0320 16:23:11.368238 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc411bc97_1398_438f_a194_1d53f896f405.slice/crio-4ab73b34a04f781e39ebee32362c93bbe7d9cea766f6700e6ed8773e6704497d WatchSource:0}: Error finding container 4ab73b34a04f781e39ebee32362c93bbe7d9cea766f6700e6ed8773e6704497d: Status 404 returned error can't find the container with id 4ab73b34a04f781e39ebee32362c93bbe7d9cea766f6700e6ed8773e6704497d Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.477316 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.658072 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d6cd9476b-xwbkm"] Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.661680 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.669639 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.672532 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.695546 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d6cd9476b-xwbkm"] Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806614 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-config-data\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806658 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmt28\" (UniqueName: \"kubernetes.io/projected/32e344b5-b713-4104-ac00-0793fd3e94d9-kube-api-access-zmt28\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806692 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32e344b5-b713-4104-ac00-0793fd3e94d9-logs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806714 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-public-tls-certs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806754 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-config-data-custom\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806797 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-internal-tls-certs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.806822 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-combined-ca-bundle\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.882331 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bf9f9b95c-kgcr5" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.886343 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919289 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-config-data\") pod \"d94a2d76-92e2-4403-ad6d-e2124b400d78\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919360 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhrgg\" (UniqueName: \"kubernetes.io/projected/d94a2d76-92e2-4403-ad6d-e2124b400d78-kube-api-access-xhrgg\") pod \"d94a2d76-92e2-4403-ad6d-e2124b400d78\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919415 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d94a2d76-92e2-4403-ad6d-e2124b400d78-horizon-secret-key\") pod \"d94a2d76-92e2-4403-ad6d-e2124b400d78\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919460 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-scripts\") pod \"d94a2d76-92e2-4403-ad6d-e2124b400d78\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919562 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32e344b5-b713-4104-ac00-0793fd3e94d9-logs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " 
pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919603 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-public-tls-certs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-config-data-custom\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919716 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-internal-tls-certs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919784 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-combined-ca-bundle\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919936 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-config-data\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " 
pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.919981 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmt28\" (UniqueName: \"kubernetes.io/projected/32e344b5-b713-4104-ac00-0793fd3e94d9-kube-api-access-zmt28\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.935418 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-combined-ca-bundle\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.935688 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32e344b5-b713-4104-ac00-0793fd3e94d9-logs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.936420 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94a2d76-92e2-4403-ad6d-e2124b400d78-kube-api-access-xhrgg" (OuterVolumeSpecName: "kube-api-access-xhrgg") pod "d94a2d76-92e2-4403-ad6d-e2124b400d78" (UID: "d94a2d76-92e2-4403-ad6d-e2124b400d78"). InnerVolumeSpecName "kube-api-access-xhrgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.954279 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-b6db44d49-fvfv9" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.954981 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-internal-tls-certs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.955933 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-config-data\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.956432 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d94a2d76-92e2-4403-ad6d-e2124b400d78-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d94a2d76-92e2-4403-ad6d-e2124b400d78" (UID: "d94a2d76-92e2-4403-ad6d-e2124b400d78"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.968317 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-public-tls-certs\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.975069 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32e344b5-b713-4104-ac00-0793fd3e94d9-config-data-custom\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.984001 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmt28\" (UniqueName: \"kubernetes.io/projected/32e344b5-b713-4104-ac00-0793fd3e94d9-kube-api-access-zmt28\") pod \"barbican-api-7d6cd9476b-xwbkm\" (UID: \"32e344b5-b713-4104-ac00-0793fd3e94d9\") " pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:11 crc kubenswrapper[4675]: I0320 16:23:11.990711 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-config-data" (OuterVolumeSpecName: "config-data") pod "d94a2d76-92e2-4403-ad6d-e2124b400d78" (UID: "d94a2d76-92e2-4403-ad6d-e2124b400d78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.002230 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-scripts" (OuterVolumeSpecName: "scripts") pod "d94a2d76-92e2-4403-ad6d-e2124b400d78" (UID: "d94a2d76-92e2-4403-ad6d-e2124b400d78"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.021341 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d94a2d76-92e2-4403-ad6d-e2124b400d78-logs\") pod \"d94a2d76-92e2-4403-ad6d-e2124b400d78\" (UID: \"d94a2d76-92e2-4403-ad6d-e2124b400d78\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.022026 4675 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d94a2d76-92e2-4403-ad6d-e2124b400d78-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.022052 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.022064 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d94a2d76-92e2-4403-ad6d-e2124b400d78-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.022076 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhrgg\" (UniqueName: \"kubernetes.io/projected/d94a2d76-92e2-4403-ad6d-e2124b400d78-kube-api-access-xhrgg\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.022486 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94a2d76-92e2-4403-ad6d-e2124b400d78-logs" (OuterVolumeSpecName: "logs") pod "d94a2d76-92e2-4403-ad6d-e2124b400d78" (UID: "d94a2d76-92e2-4403-ad6d-e2124b400d78"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.062105 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b9b9d8b55-gbspf" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.095648 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.127372 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c227a3-b831-440b-ab1a-4171217faf81-logs\") pod \"85c227a3-b831-440b-ab1a-4171217faf81\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.127734 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d94a2d76-92e2-4403-ad6d-e2124b400d78-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.127890 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c227a3-b831-440b-ab1a-4171217faf81-logs" (OuterVolumeSpecName: "logs") pod "85c227a3-b831-440b-ab1a-4171217faf81" (UID: "85c227a3-b831-440b-ab1a-4171217faf81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.134978 4675 generic.go:334] "Generic (PLEG): container finished" podID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerID="0ea697f15b863561ce440e2aacebf2bd03ae127718dffe4eab7a26be5517aab0" exitCode=137 Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.135140 4675 generic.go:334] "Generic (PLEG): container finished" podID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerID="0675bef8fe5eaffaa6c1e0c910e6915ea7c74dafe2ab8373e5ebc1df747f5a63" exitCode=137 Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.135053 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-546bd65fb7-ktfbp" event={"ID":"b282cfa4-8448-4eef-8463-fca67d9608fd","Type":"ContainerDied","Data":"0ea697f15b863561ce440e2aacebf2bd03ae127718dffe4eab7a26be5517aab0"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.135365 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-546bd65fb7-ktfbp" event={"ID":"b282cfa4-8448-4eef-8463-fca67d9608fd","Type":"ContainerDied","Data":"0675bef8fe5eaffaa6c1e0c910e6915ea7c74dafe2ab8373e5ebc1df747f5a63"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.149003 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cab423d6-a026-41a9-8ae7-7ab2339de5ef","Type":"ContainerStarted","Data":"139c4593889560d15eb79442e1e9e0e4a052ae6e2706cdb6dd3c1f19673a5d0a"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166098 4675 generic.go:334] "Generic (PLEG): container finished" podID="85c227a3-b831-440b-ab1a-4171217faf81" containerID="a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d" exitCode=137 Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166142 4675 generic.go:334] "Generic (PLEG): container finished" podID="85c227a3-b831-440b-ab1a-4171217faf81" containerID="e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87" 
exitCode=137 Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166295 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b9b9d8b55-gbspf" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166466 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9b9d8b55-gbspf" event={"ID":"85c227a3-b831-440b-ab1a-4171217faf81","Type":"ContainerDied","Data":"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166567 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9b9d8b55-gbspf" event={"ID":"85c227a3-b831-440b-ab1a-4171217faf81","Type":"ContainerDied","Data":"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166626 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9b9d8b55-gbspf" event={"ID":"85c227a3-b831-440b-ab1a-4171217faf81","Type":"ContainerDied","Data":"e5a7403e6a3dec7b0efd0423e6523937c2605b8ecc7f623efe3a05e7d4548aea"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.166686 4675 scope.go:117] "RemoveContainer" containerID="a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.184558 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9ffbdb49-tcvn8" event={"ID":"c411bc97-1398-438f-a194-1d53f896f405","Type":"ContainerStarted","Data":"41b187d5defe69e5082be1c97d7becbc1a540d5c18f8deda820c7402d74f7f40"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.184712 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9ffbdb49-tcvn8" event={"ID":"c411bc97-1398-438f-a194-1d53f896f405","Type":"ContainerStarted","Data":"4ab73b34a04f781e39ebee32362c93bbe7d9cea766f6700e6ed8773e6704497d"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.213954 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6d08d76-57ad-48b8-98d5-cc802e2b7194","Type":"ContainerStarted","Data":"c09a9f444e1923fddf3375475305a15ea2cd2303959f75ad89ff5686871d8663"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.228998 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85c227a3-b831-440b-ab1a-4171217faf81-horizon-secret-key\") pod \"85c227a3-b831-440b-ab1a-4171217faf81\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.229085 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-scripts\") pod \"85c227a3-b831-440b-ab1a-4171217faf81\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.229215 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pl9m\" (UniqueName: \"kubernetes.io/projected/85c227a3-b831-440b-ab1a-4171217faf81-kube-api-access-6pl9m\") pod \"85c227a3-b831-440b-ab1a-4171217faf81\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.229260 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-config-data\") pod \"85c227a3-b831-440b-ab1a-4171217faf81\" (UID: \"85c227a3-b831-440b-ab1a-4171217faf81\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.229742 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c227a3-b831-440b-ab1a-4171217faf81-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.246074 4675 generic.go:334] "Generic 
(PLEG): container finished" podID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerID="f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7" exitCode=137 Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.246175 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf9f9b95c-kgcr5" event={"ID":"d94a2d76-92e2-4403-ad6d-e2124b400d78","Type":"ContainerDied","Data":"f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.246219 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf9f9b95c-kgcr5" event={"ID":"d94a2d76-92e2-4403-ad6d-e2124b400d78","Type":"ContainerDied","Data":"5ca707fa6b221436887a3e61b4dd7600c7645d1480ab8b43494fabbc82e9f413"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.246310 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bf9f9b95c-kgcr5" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.257850 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerStarted","Data":"0f62eb8a5b345a4ddd014d4020dba949777482e9dcb37626960de4a30fe752a6"} Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.261123 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.458843317 podStartE2EDuration="8.261105247s" podCreationTimestamp="2026-03-20 16:23:04 +0000 UTC" firstStartedPulling="2026-03-20 16:23:06.130791436 +0000 UTC m=+1306.164420973" lastFinishedPulling="2026-03-20 16:23:08.933053366 +0000 UTC m=+1308.966682903" observedRunningTime="2026-03-20 16:23:12.246640754 +0000 UTC m=+1312.280270301" watchObservedRunningTime="2026-03-20 16:23:12.261105247 +0000 UTC m=+1312.294734784" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.266265 4675 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c227a3-b831-440b-ab1a-4171217faf81-kube-api-access-6pl9m" (OuterVolumeSpecName: "kube-api-access-6pl9m") pod "85c227a3-b831-440b-ab1a-4171217faf81" (UID: "85c227a3-b831-440b-ab1a-4171217faf81"). InnerVolumeSpecName "kube-api-access-6pl9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.267717 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c227a3-b831-440b-ab1a-4171217faf81-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "85c227a3-b831-440b-ab1a-4171217faf81" (UID: "85c227a3-b831-440b-ab1a-4171217faf81"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.288412 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-config-data" (OuterVolumeSpecName: "config-data") pod "85c227a3-b831-440b-ab1a-4171217faf81" (UID: "85c227a3-b831-440b-ab1a-4171217faf81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.294580 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.299135 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-scripts" (OuterVolumeSpecName: "scripts") pod "85c227a3-b831-440b-ab1a-4171217faf81" (UID: "85c227a3-b831-440b-ab1a-4171217faf81"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.331005 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pl9m\" (UniqueName: \"kubernetes.io/projected/85c227a3-b831-440b-ab1a-4171217faf81-kube-api-access-6pl9m\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.331031 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.331040 4675 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85c227a3-b831-440b-ab1a-4171217faf81-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.331052 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85c227a3-b831-440b-ab1a-4171217faf81-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.394831 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bf9f9b95c-kgcr5"] Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.406359 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5bf9f9b95c-kgcr5"] Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.432215 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b282cfa4-8448-4eef-8463-fca67d9608fd-logs\") pod \"b282cfa4-8448-4eef-8463-fca67d9608fd\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.432292 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs5lz\" (UniqueName: 
\"kubernetes.io/projected/b282cfa4-8448-4eef-8463-fca67d9608fd-kube-api-access-hs5lz\") pod \"b282cfa4-8448-4eef-8463-fca67d9608fd\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.432391 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b282cfa4-8448-4eef-8463-fca67d9608fd-horizon-secret-key\") pod \"b282cfa4-8448-4eef-8463-fca67d9608fd\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.432538 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-scripts\") pod \"b282cfa4-8448-4eef-8463-fca67d9608fd\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.432556 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-config-data\") pod \"b282cfa4-8448-4eef-8463-fca67d9608fd\" (UID: \"b282cfa4-8448-4eef-8463-fca67d9608fd\") " Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.433446 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b282cfa4-8448-4eef-8463-fca67d9608fd-logs" (OuterVolumeSpecName: "logs") pod "b282cfa4-8448-4eef-8463-fca67d9608fd" (UID: "b282cfa4-8448-4eef-8463-fca67d9608fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.443918 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b282cfa4-8448-4eef-8463-fca67d9608fd-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.457796 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b282cfa4-8448-4eef-8463-fca67d9608fd-kube-api-access-hs5lz" (OuterVolumeSpecName: "kube-api-access-hs5lz") pod "b282cfa4-8448-4eef-8463-fca67d9608fd" (UID: "b282cfa4-8448-4eef-8463-fca67d9608fd"). InnerVolumeSpecName "kube-api-access-hs5lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.459495 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b282cfa4-8448-4eef-8463-fca67d9608fd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b282cfa4-8448-4eef-8463-fca67d9608fd" (UID: "b282cfa4-8448-4eef-8463-fca67d9608fd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.513470 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-config-data" (OuterVolumeSpecName: "config-data") pod "b282cfa4-8448-4eef-8463-fca67d9608fd" (UID: "b282cfa4-8448-4eef-8463-fca67d9608fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.547481 4675 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b282cfa4-8448-4eef-8463-fca67d9608fd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.547520 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.547532 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs5lz\" (UniqueName: \"kubernetes.io/projected/b282cfa4-8448-4eef-8463-fca67d9608fd-kube-api-access-hs5lz\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.548053 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-scripts" (OuterVolumeSpecName: "scripts") pod "b282cfa4-8448-4eef-8463-fca67d9608fd" (UID: "b282cfa4-8448-4eef-8463-fca67d9608fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.649177 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b282cfa4-8448-4eef-8463-fca67d9608fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.702253 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" path="/var/lib/kubelet/pods/d94a2d76-92e2-4403-ad6d-e2124b400d78/volumes" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.703014 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b9b9d8b55-gbspf"] Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.706168 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b9b9d8b55-gbspf"] Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.763756 4675 scope.go:117] "RemoveContainer" containerID="e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.864724 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d6cd9476b-xwbkm"] Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.874973 4675 scope.go:117] "RemoveContainer" containerID="a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d" Mar 20 16:23:12 crc kubenswrapper[4675]: E0320 16:23:12.875403 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d\": container with ID starting with a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d not found: ID does not exist" containerID="a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.875436 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d"} err="failed to get container status \"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d\": rpc error: code = NotFound desc = could not find container \"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d\": container with ID starting with a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d not found: ID does not exist" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.875467 4675 scope.go:117] "RemoveContainer" containerID="e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87" Mar 20 16:23:12 crc kubenswrapper[4675]: E0320 16:23:12.875686 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87\": container with ID starting with e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87 not found: ID does not exist" containerID="e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.875714 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87"} err="failed to get container status \"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87\": rpc error: code = NotFound desc = could not find container \"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87\": container with ID starting with e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87 not found: ID does not exist" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.875731 4675 scope.go:117] "RemoveContainer" containerID="a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.876889 4675 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d"} err="failed to get container status \"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d\": rpc error: code = NotFound desc = could not find container \"a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d\": container with ID starting with a43047dc5328043e3b8fb97fe858931e1c672340395e6d733dbb31d15b4df17d not found: ID does not exist" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.876911 4675 scope.go:117] "RemoveContainer" containerID="e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.877345 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87"} err="failed to get container status \"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87\": rpc error: code = NotFound desc = could not find container \"e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87\": container with ID starting with e154ed2b38a4be68e9485f753ac5742871fe878300759600275511f77fc49a87 not found: ID does not exist" Mar 20 16:23:12 crc kubenswrapper[4675]: I0320 16:23:12.877364 4675 scope.go:117] "RemoveContainer" containerID="f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.074681 4675 scope.go:117] "RemoveContainer" containerID="91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.125836 4675 scope.go:117] "RemoveContainer" containerID="f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7" Mar 20 16:23:13 crc kubenswrapper[4675]: E0320 16:23:13.128112 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7\": container with ID starting with f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7 not found: ID does not exist" containerID="f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.128156 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7"} err="failed to get container status \"f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7\": rpc error: code = NotFound desc = could not find container \"f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7\": container with ID starting with f64a3fc2d65180f45d7b76db2f48b7d286a97c488208efb94bc198852ef433c7 not found: ID does not exist" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.128185 4675 scope.go:117] "RemoveContainer" containerID="91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817" Mar 20 16:23:13 crc kubenswrapper[4675]: E0320 16:23:13.130567 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817\": container with ID starting with 91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817 not found: ID does not exist" containerID="91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.130599 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817"} err="failed to get container status \"91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817\": rpc error: code = NotFound desc = could not find container \"91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817\": container with ID 
starting with 91977a921f97fd88168637ee9002a7dba9ce9d908871781ce93aff2da0b4d817 not found: ID does not exist" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.279864 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerStarted","Data":"0c8a37fb85c90d253f250f0faeaa438aa9eb85865730c5d201ceb86047b799f4"} Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.290074 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-546bd65fb7-ktfbp" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.290080 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-546bd65fb7-ktfbp" event={"ID":"b282cfa4-8448-4eef-8463-fca67d9608fd","Type":"ContainerDied","Data":"87cbee3a0af82bfda61fe02012097ad1957713acb71b9c41eb26f72620b31922"} Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.290136 4675 scope.go:117] "RemoveContainer" containerID="0ea697f15b863561ce440e2aacebf2bd03ae127718dffe4eab7a26be5517aab0" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.294041 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cab423d6-a026-41a9-8ae7-7ab2339de5ef","Type":"ContainerStarted","Data":"4fc502fed2b2ed50fd9b0457cd6ba4a9b1dfbc43459b046d7383ded7cb843c60"} Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.296941 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9ffbdb49-tcvn8" event={"ID":"c411bc97-1398-438f-a194-1d53f896f405","Type":"ContainerStarted","Data":"cc9708da5a7c8616145fe9940d61cd4dcad01b3d61d715717c22c76a74ce7fc9"} Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.297024 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f9ffbdb49-tcvn8" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.345174 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-7d6cd9476b-xwbkm" event={"ID":"32e344b5-b713-4104-ac00-0793fd3e94d9","Type":"ContainerStarted","Data":"55fa89b25b0d794f7b5ac79b538b6358129d04f4adb6a5ac7c78f68e6ed8a516"} Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.345212 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d6cd9476b-xwbkm" event={"ID":"32e344b5-b713-4104-ac00-0793fd3e94d9","Type":"ContainerStarted","Data":"b5ab2d888f55ead5a565dba8cba62658a36491b24fbbf552b2b25b1674068de8"} Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.390428 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f9ffbdb49-tcvn8" podStartSLOduration=4.390404849 podStartE2EDuration="4.390404849s" podCreationTimestamp="2026-03-20 16:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:13.353245645 +0000 UTC m=+1313.386875192" watchObservedRunningTime="2026-03-20 16:23:13.390404849 +0000 UTC m=+1313.424034386" Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.441827 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-546bd65fb7-ktfbp"] Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.461385 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-546bd65fb7-ktfbp"] Mar 20 16:23:13 crc kubenswrapper[4675]: I0320 16:23:13.567203 4675 scope.go:117] "RemoveContainer" containerID="0675bef8fe5eaffaa6c1e0c910e6915ea7c74dafe2ab8373e5ebc1df747f5a63" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.035562 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d79d4db6d-vnw9g" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.076815 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.123177 
4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c454cc68b-lmjfb"] Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.368684 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cab423d6-a026-41a9-8ae7-7ab2339de5ef","Type":"ContainerStarted","Data":"cccea74b0bbf7333531ae3078b802d8b7efdfcc42de7aa497f245e681bc4f70b"} Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.370124 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.377500 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d6cd9476b-xwbkm" event={"ID":"32e344b5-b713-4104-ac00-0793fd3e94d9","Type":"ContainerStarted","Data":"138618fa527cebf984e2c543f482cc205bac86bb7ec6830d022e1a98e3810346"} Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.377832 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.377956 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.384071 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerStarted","Data":"09f6434a0f6c781f5df63682f45522e7ecf1b0341fc853292c553ceeed72b65a"} Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.384307 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c454cc68b-lmjfb" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon-log" containerID="cri-o://6186dbac985c54733a1640322c538c03ab0fe9a306fd411d2f745642ee54de5f" gracePeriod=30 Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.384465 4675 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/horizon-7c454cc68b-lmjfb" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" containerID="cri-o://94241f316e66ac5988644a931eae1198f8d67b4991a0efd911a7d74bc29be0f7" gracePeriod=30 Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.410378 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.410357258 podStartE2EDuration="4.410357258s" podCreationTimestamp="2026-03-20 16:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:14.401221494 +0000 UTC m=+1314.434851031" watchObservedRunningTime="2026-03-20 16:23:14.410357258 +0000 UTC m=+1314.443986795" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.447056 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d6cd9476b-xwbkm" podStartSLOduration=3.447030698 podStartE2EDuration="3.447030698s" podCreationTimestamp="2026-03-20 16:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:14.435399845 +0000 UTC m=+1314.469029392" watchObservedRunningTime="2026-03-20 16:23:14.447030698 +0000 UTC m=+1314.480660245" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.689648 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c227a3-b831-440b-ab1a-4171217faf81" path="/var/lib/kubelet/pods/85c227a3-b831-440b-ab1a-4171217faf81/volumes" Mar 20 16:23:14 crc kubenswrapper[4675]: I0320 16:23:14.690480 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" path="/var/lib/kubelet/pods/b282cfa4-8448-4eef-8463-fca67d9608fd/volumes" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.223543 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.227915 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.394448 4675 generic.go:334] "Generic (PLEG): container finished" podID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerID="e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5" exitCode=0 Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.394529 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6db44d49-fvfv9" event={"ID":"e7c2420c-dca0-41c9-8f28-79a5337c7444","Type":"ContainerDied","Data":"e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5"} Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.394567 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b6db44d49-fvfv9" event={"ID":"e7c2420c-dca0-41c9-8f28-79a5337c7444","Type":"ContainerDied","Data":"f8e1a0252f802d37a70ca8af7f07735d8d82bb1f171df95f6806b394b8e1c101"} Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.394591 4675 scope.go:117] "RemoveContainer" containerID="43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.394717 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b6db44d49-fvfv9" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.398926 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerStarted","Data":"dfa8caf51455d0e2e66fb19d48eff9ebc6c39b3ab4ffa7f536da7e1dee06a6df"} Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.418401 4675 scope.go:117] "RemoveContainer" containerID="e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.429501 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-combined-ca-bundle\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.429550 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-public-tls-certs\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.429705 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-httpd-config\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.429729 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-internal-tls-certs\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc 
kubenswrapper[4675]: I0320 16:23:15.429877 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-config\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.429935 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkc5k\" (UniqueName: \"kubernetes.io/projected/e7c2420c-dca0-41c9-8f28-79a5337c7444-kube-api-access-xkc5k\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.430040 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-ovndb-tls-certs\") pod \"e7c2420c-dca0-41c9-8f28-79a5337c7444\" (UID: \"e7c2420c-dca0-41c9-8f28-79a5337c7444\") " Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.436376 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c2420c-dca0-41c9-8f28-79a5337c7444-kube-api-access-xkc5k" (OuterVolumeSpecName: "kube-api-access-xkc5k") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: "e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "kube-api-access-xkc5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.436730 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: "e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.448413 4675 scope.go:117] "RemoveContainer" containerID="43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1" Mar 20 16:23:15 crc kubenswrapper[4675]: E0320 16:23:15.448867 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1\": container with ID starting with 43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1 not found: ID does not exist" containerID="43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.448899 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1"} err="failed to get container status \"43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1\": rpc error: code = NotFound desc = could not find container \"43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1\": container with ID starting with 43011c4f87746c830296462d0daa0a96bc53537eaa760082003bb4ad197164c1 not found: ID does not exist" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.448922 4675 scope.go:117] "RemoveContainer" containerID="e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5" Mar 20 16:23:15 crc kubenswrapper[4675]: E0320 16:23:15.449123 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5\": container with ID starting with e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5 not found: ID does not exist" containerID="e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.449145 
4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5"} err="failed to get container status \"e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5\": rpc error: code = NotFound desc = could not find container \"e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5\": container with ID starting with e2aaaafc87348950030b8413d2b9a8a56c87197861ef9d13b6bc85918a470eb5 not found: ID does not exist" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.470040 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.502406 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: "e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.510421 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-config" (OuterVolumeSpecName: "config") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: "e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.521550 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.522085 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: "e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.530404 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: "e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.532671 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.532881 4675 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.532945 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.533007 4675 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.533064 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.533115 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkc5k\" (UniqueName: \"kubernetes.io/projected/e7c2420c-dca0-41c9-8f28-79a5337c7444-kube-api-access-xkc5k\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.552581 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e7c2420c-dca0-41c9-8f28-79a5337c7444" (UID: 
"e7c2420c-dca0-41c9-8f28-79a5337c7444"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.635386 4675 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c2420c-dca0-41c9-8f28-79a5337c7444-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.728745 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b6db44d49-fvfv9"] Mar 20 16:23:15 crc kubenswrapper[4675]: I0320 16:23:15.739308 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b6db44d49-fvfv9"] Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.110295 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.186731 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-xxh7q"] Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.187037 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerName="dnsmasq-dns" containerID="cri-o://b98ea38e522301d1725bf630c264a8f413e300bd1d35097ce7fa4916c2baddc0" gracePeriod=10 Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.427462 4675 generic.go:334] "Generic (PLEG): container finished" podID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerID="b98ea38e522301d1725bf630c264a8f413e300bd1d35097ce7fa4916c2baddc0" exitCode=0 Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.427890 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" event={"ID":"6038548b-5443-43f8-adec-c61ed20a6c2f","Type":"ContainerDied","Data":"b98ea38e522301d1725bf630c264a8f413e300bd1d35097ce7fa4916c2baddc0"} Mar 20 
16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.433487 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="cinder-scheduler" containerID="cri-o://df748cc02568643b3a9cce8a55dbb5a468996406070e638b5283f7c5472bd336" gracePeriod=30 Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.434805 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="probe" containerID="cri-o://c09a9f444e1923fddf3375475305a15ea2cd2303959f75ad89ff5686871d8663" gracePeriod=30 Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.701836 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" path="/var/lib/kubelet/pods/e7c2420c-dca0-41c9-8f28-79a5337c7444/volumes" Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.766984 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.964548 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-sb\") pod \"6038548b-5443-43f8-adec-c61ed20a6c2f\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.965019 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-config\") pod \"6038548b-5443-43f8-adec-c61ed20a6c2f\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.965063 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-swift-storage-0\") pod \"6038548b-5443-43f8-adec-c61ed20a6c2f\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.965091 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-nb\") pod \"6038548b-5443-43f8-adec-c61ed20a6c2f\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.965155 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8ztp\" (UniqueName: \"kubernetes.io/projected/6038548b-5443-43f8-adec-c61ed20a6c2f-kube-api-access-t8ztp\") pod \"6038548b-5443-43f8-adec-c61ed20a6c2f\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.965197 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-svc\") pod \"6038548b-5443-43f8-adec-c61ed20a6c2f\" (UID: \"6038548b-5443-43f8-adec-c61ed20a6c2f\") " Mar 20 16:23:16 crc kubenswrapper[4675]: I0320 16:23:16.983171 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6038548b-5443-43f8-adec-c61ed20a6c2f-kube-api-access-t8ztp" (OuterVolumeSpecName: "kube-api-access-t8ztp") pod "6038548b-5443-43f8-adec-c61ed20a6c2f" (UID: "6038548b-5443-43f8-adec-c61ed20a6c2f"). InnerVolumeSpecName "kube-api-access-t8ztp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.038648 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6038548b-5443-43f8-adec-c61ed20a6c2f" (UID: "6038548b-5443-43f8-adec-c61ed20a6c2f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.059314 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6038548b-5443-43f8-adec-c61ed20a6c2f" (UID: "6038548b-5443-43f8-adec-c61ed20a6c2f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.080242 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.080287 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8ztp\" (UniqueName: \"kubernetes.io/projected/6038548b-5443-43f8-adec-c61ed20a6c2f-kube-api-access-t8ztp\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.080301 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.130237 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-config" (OuterVolumeSpecName: "config") pod "6038548b-5443-43f8-adec-c61ed20a6c2f" (UID: "6038548b-5443-43f8-adec-c61ed20a6c2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.130343 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6038548b-5443-43f8-adec-c61ed20a6c2f" (UID: "6038548b-5443-43f8-adec-c61ed20a6c2f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.139251 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6038548b-5443-43f8-adec-c61ed20a6c2f" (UID: "6038548b-5443-43f8-adec-c61ed20a6c2f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.182548 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.182803 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.182881 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6038548b-5443-43f8-adec-c61ed20a6c2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.445066 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerStarted","Data":"cb653121160ab646b20f0ee7eb09c16b50d77a374e91186869db35bbefe627ff"} Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.445228 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.446981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" 
event={"ID":"6038548b-5443-43f8-adec-c61ed20a6c2f","Type":"ContainerDied","Data":"3058832c4d856ee34dc81f9cb4bc74e6bdfc4bb4eeb3935bb83216bd6e595525"} Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.447006 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-xxh7q" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.447019 4675 scope.go:117] "RemoveContainer" containerID="b98ea38e522301d1725bf630c264a8f413e300bd1d35097ce7fa4916c2baddc0" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.449101 4675 generic.go:334] "Generic (PLEG): container finished" podID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerID="c09a9f444e1923fddf3375475305a15ea2cd2303959f75ad89ff5686871d8663" exitCode=0 Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.449149 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6d08d76-57ad-48b8-98d5-cc802e2b7194","Type":"ContainerDied","Data":"c09a9f444e1923fddf3375475305a15ea2cd2303959f75ad89ff5686871d8663"} Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.465086 4675 scope.go:117] "RemoveContainer" containerID="dec817acbd187c5374d806e20a1e31335626743dd9e3d6d16cdbab7a46c8fea5" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.500334 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3292932029999998 podStartE2EDuration="7.500315523s" podCreationTimestamp="2026-03-20 16:23:10 +0000 UTC" firstStartedPulling="2026-03-20 16:23:11.382237833 +0000 UTC m=+1311.415867370" lastFinishedPulling="2026-03-20 16:23:16.553260153 +0000 UTC m=+1316.586889690" observedRunningTime="2026-03-20 16:23:17.481396017 +0000 UTC m=+1317.515025574" watchObservedRunningTime="2026-03-20 16:23:17.500315523 +0000 UTC m=+1317.533945050" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.509454 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.509897 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-xxh7q"] Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.517454 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-xxh7q"] Mar 20 16:23:17 crc kubenswrapper[4675]: I0320 16:23:17.817117 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:18 crc kubenswrapper[4675]: I0320 16:23:18.468874 4675 generic.go:334] "Generic (PLEG): container finished" podID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerID="94241f316e66ac5988644a931eae1198f8d67b4991a0efd911a7d74bc29be0f7" exitCode=0 Mar 20 16:23:18 crc kubenswrapper[4675]: I0320 16:23:18.469750 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c454cc68b-lmjfb" event={"ID":"3e972387-c641-42bd-9c3f-69fc70869c8a","Type":"ContainerDied","Data":"94241f316e66ac5988644a931eae1198f8d67b4991a0efd911a7d74bc29be0f7"} Mar 20 16:23:18 crc kubenswrapper[4675]: I0320 16:23:18.693797 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" path="/var/lib/kubelet/pods/6038548b-5443-43f8-adec-c61ed20a6c2f/volumes" Mar 20 16:23:18 crc kubenswrapper[4675]: I0320 16:23:18.707363 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c454cc68b-lmjfb" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 16:23:18 crc kubenswrapper[4675]: E0320 16:23:18.732996 4675 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d08d76_57ad_48b8_98d5_cc802e2b7194.slice/crio-conmon-df748cc02568643b3a9cce8a55dbb5a468996406070e638b5283f7c5472bd336.scope\": RecentStats: unable to find data in memory cache]" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.173172 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.173501 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.563336 4675 generic.go:334] "Generic (PLEG): container finished" podID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerID="df748cc02568643b3a9cce8a55dbb5a468996406070e638b5283f7c5472bd336" exitCode=0 Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.563643 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6d08d76-57ad-48b8-98d5-cc802e2b7194","Type":"ContainerDied","Data":"df748cc02568643b3a9cce8a55dbb5a468996406070e638b5283f7c5472bd336"} Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.563681 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6d08d76-57ad-48b8-98d5-cc802e2b7194","Type":"ContainerDied","Data":"5d09e7fafedf66bb977fd06abd5dea02e3e4afb100090b50b6f5c8c5d511f56f"} Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.563696 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d09e7fafedf66bb977fd06abd5dea02e3e4afb100090b50b6f5c8c5d511f56f" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.579595 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641028 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjw6\" (UniqueName: \"kubernetes.io/projected/b6d08d76-57ad-48b8-98d5-cc802e2b7194-kube-api-access-ffjw6\") pod \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641114 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data\") pod \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641219 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6d08d76-57ad-48b8-98d5-cc802e2b7194-etc-machine-id\") pod \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641255 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-combined-ca-bundle\") pod \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641275 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data-custom\") pod \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641376 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-scripts\") pod \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\" (UID: \"b6d08d76-57ad-48b8-98d5-cc802e2b7194\") " Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.641895 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6d08d76-57ad-48b8-98d5-cc802e2b7194-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b6d08d76-57ad-48b8-98d5-cc802e2b7194" (UID: "b6d08d76-57ad-48b8-98d5-cc802e2b7194"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.695738 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-scripts" (OuterVolumeSpecName: "scripts") pod "b6d08d76-57ad-48b8-98d5-cc802e2b7194" (UID: "b6d08d76-57ad-48b8-98d5-cc802e2b7194"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.697687 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d08d76-57ad-48b8-98d5-cc802e2b7194-kube-api-access-ffjw6" (OuterVolumeSpecName: "kube-api-access-ffjw6") pod "b6d08d76-57ad-48b8-98d5-cc802e2b7194" (UID: "b6d08d76-57ad-48b8-98d5-cc802e2b7194"). InnerVolumeSpecName "kube-api-access-ffjw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.728271 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6d08d76-57ad-48b8-98d5-cc802e2b7194" (UID: "b6d08d76-57ad-48b8-98d5-cc802e2b7194"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.743816 4675 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6d08d76-57ad-48b8-98d5-cc802e2b7194-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.743847 4675 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.743860 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.743869 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjw6\" (UniqueName: \"kubernetes.io/projected/b6d08d76-57ad-48b8-98d5-cc802e2b7194-kube-api-access-ffjw6\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.776977 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6d08d76-57ad-48b8-98d5-cc802e2b7194" (UID: "b6d08d76-57ad-48b8-98d5-cc802e2b7194"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.855720 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.881917 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data" (OuterVolumeSpecName: "config-data") pod "b6d08d76-57ad-48b8-98d5-cc802e2b7194" (UID: "b6d08d76-57ad-48b8-98d5-cc802e2b7194"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:19 crc kubenswrapper[4675]: I0320 16:23:19.957092 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6d08d76-57ad-48b8-98d5-cc802e2b7194-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.414350 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5f7ccf99c9-m6s8x" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.570437 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.614621 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.621422 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.641934 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642291 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642325 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642354 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642363 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642376 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="cinder-scheduler" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642384 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="cinder-scheduler" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642391 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-api" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642399 4675 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-api" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642414 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642421 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642432 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="probe" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642438 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="probe" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642449 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-httpd" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642456 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-httpd" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642467 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642476 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642485 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642492 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" 
containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642508 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerName="init" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642514 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerName="init" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642526 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642533 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: E0320 16:23:20.642541 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerName="dnsmasq-dns" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642547 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerName="dnsmasq-dns" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642750 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-api" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642781 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="probe" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642792 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="6038548b-5443-43f8-adec-c61ed20a6c2f" containerName="dnsmasq-dns" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642804 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon-log" Mar 20 16:23:20 crc 
kubenswrapper[4675]: I0320 16:23:20.642812 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642825 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94a2d76-92e2-4403-ad6d-e2124b400d78" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642835 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b282cfa4-8448-4eef-8463-fca67d9608fd" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642848 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon-log" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642864 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c2420c-dca0-41c9-8f28-79a5337c7444" containerName="neutron-httpd" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642875 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" containerName="cinder-scheduler" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.642887 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c227a3-b831-440b-ab1a-4171217faf81" containerName="horizon" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.643966 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.647076 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.656002 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.670424 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.670486 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw458\" (UniqueName: \"kubernetes.io/projected/feac59ee-65ab-4f54-a829-fdea75fd800b-kube-api-access-nw458\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.670536 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-scripts\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.670567 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feac59ee-65ab-4f54-a829-fdea75fd800b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 
16:23:20.670603 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.670628 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-config-data\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.700633 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d08d76-57ad-48b8-98d5-cc802e2b7194" path="/var/lib/kubelet/pods/b6d08d76-57ad-48b8-98d5-cc802e2b7194/volumes" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.773216 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-config-data\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.773604 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.773697 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw458\" (UniqueName: \"kubernetes.io/projected/feac59ee-65ab-4f54-a829-fdea75fd800b-kube-api-access-nw458\") pod 
\"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.773810 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-scripts\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.773921 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feac59ee-65ab-4f54-a829-fdea75fd800b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.774185 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.776888 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/feac59ee-65ab-4f54-a829-fdea75fd800b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.779554 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.791467 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.794679 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-scripts\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.800332 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw458\" (UniqueName: \"kubernetes.io/projected/feac59ee-65ab-4f54-a829-fdea75fd800b-kube-api-access-nw458\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.801396 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feac59ee-65ab-4f54-a829-fdea75fd800b-config-data\") pod \"cinder-scheduler-0\" (UID: \"feac59ee-65ab-4f54-a829-fdea75fd800b\") " pod="openstack/cinder-scheduler-0" Mar 20 16:23:20 crc kubenswrapper[4675]: I0320 16:23:20.964009 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.413890 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.488798 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.527573 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-cf47f4dbd-zbrxj" Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.596443 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"feac59ee-65ab-4f54-a829-fdea75fd800b","Type":"ContainerStarted","Data":"4842bb0603329536b7b785e616e057470f56b2e3c1655e2299fcbb72eb495b71"} Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.618433 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-678454f7d4-qg8vh"] Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.618692 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-678454f7d4-qg8vh" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-log" containerID="cri-o://36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7" gracePeriod=30 Mar 20 16:23:21 crc kubenswrapper[4675]: I0320 16:23:21.621830 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-678454f7d4-qg8vh" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-api" containerID="cri-o://6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7" gracePeriod=30 Mar 20 16:23:22 crc kubenswrapper[4675]: I0320 16:23:22.611811 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"feac59ee-65ab-4f54-a829-fdea75fd800b","Type":"ContainerStarted","Data":"3746086e85f5f098d49aa7bf6239c42c3e2b394a9ca2ab23b35c41c54329e6bb"} Mar 20 16:23:22 crc kubenswrapper[4675]: I0320 16:23:22.615629 4675 generic.go:334] "Generic (PLEG): container finished" podID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerID="36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7" exitCode=143 Mar 20 16:23:22 crc kubenswrapper[4675]: I0320 16:23:22.615673 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678454f7d4-qg8vh" event={"ID":"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c","Type":"ContainerDied","Data":"36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7"} Mar 20 16:23:23 crc kubenswrapper[4675]: I0320 16:23:23.628000 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"feac59ee-65ab-4f54-a829-fdea75fd800b","Type":"ContainerStarted","Data":"172d727d92573198f6985b0d558626c8bd57ef421b4c48ffdf281c85e39db08c"} Mar 20 16:23:23 crc kubenswrapper[4675]: I0320 16:23:23.677054 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.677036607 podStartE2EDuration="3.677036607s" podCreationTimestamp="2026-03-20 16:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:23.669527068 +0000 UTC m=+1323.703156625" watchObservedRunningTime="2026-03-20 16:23:23.677036607 +0000 UTC m=+1323.710666144" Mar 20 16:23:23 crc kubenswrapper[4675]: I0320 16:23:23.732382 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 16:23:23 crc kubenswrapper[4675]: I0320 16:23:23.958935 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.313006 
4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.388639 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-564d9f6b4c-5p6js"] Mar 20 16:23:25 crc kubenswrapper[4675]: E0320 16:23:25.389019 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-log" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.389036 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-log" Mar 20 16:23:25 crc kubenswrapper[4675]: E0320 16:23:25.389053 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-api" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.389061 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-api" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.389271 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-api" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.389300 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerName="placement-log" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.390268 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392442 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-internal-tls-certs\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392537 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4vdk\" (UniqueName: \"kubernetes.io/projected/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-kube-api-access-v4vdk\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392568 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-logs\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392630 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-config-data\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392672 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-scripts\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392713 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-public-tls-certs\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.392854 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-combined-ca-bundle\") pod \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\" (UID: \"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c\") " Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.393625 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-logs" (OuterVolumeSpecName: "logs") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.394338 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.394616 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.397671 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.410234 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-kube-api-access-v4vdk" (OuterVolumeSpecName: "kube-api-access-v4vdk") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "kube-api-access-v4vdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.410311 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-564d9f6b4c-5p6js"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.423888 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-scripts" (OuterVolumeSpecName: "scripts") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.486465 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.494979 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7zh\" (UniqueName: \"kubernetes.io/projected/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-kube-api-access-vn7zh\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495042 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-run-httpd\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495075 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-log-httpd\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " 
pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495100 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-combined-ca-bundle\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495117 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-internal-tls-certs\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495165 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-etc-swift\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495319 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-public-tls-certs\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495386 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-config-data\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: 
\"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495706 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4vdk\" (UniqueName: \"kubernetes.io/projected/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-kube-api-access-v4vdk\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495726 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.495740 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.498303 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.498529 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.507561 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.508203 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.511591 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tb277" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.571598 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-config-data" (OuterVolumeSpecName: "config-data") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.575211 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.596946 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-run-httpd\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.596997 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-log-httpd\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597019 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597024 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597082 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-combined-ca-bundle\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597105 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-internal-tls-certs\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597142 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597172 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-etc-swift\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " 
pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597200 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597236 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-public-tls-certs\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597283 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-config-data\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597334 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmnb\" (UniqueName: \"kubernetes.io/projected/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-kube-api-access-2cmnb\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597370 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7zh\" (UniqueName: \"kubernetes.io/projected/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-kube-api-access-vn7zh\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " 
pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597409 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-run-httpd\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597417 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597427 4675 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597439 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.597624 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-log-httpd\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.601711 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-etc-swift\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: 
I0320 16:23:25.602313 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-public-tls-certs\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.603482 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-internal-tls-certs\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.603736 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-config-data\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.604691 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-combined-ca-bundle\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.616436 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7zh\" (UniqueName: \"kubernetes.io/projected/28cb2c9b-231d-473b-bbd2-ba4be8c6787e-kube-api-access-vn7zh\") pod \"swift-proxy-564d9f6b4c-5p6js\" (UID: \"28cb2c9b-231d-473b-bbd2-ba4be8c6787e\") " pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.632049 4675 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" (UID: "ed6cac72-fb1c-4f4f-8b2a-924e6647e12c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.653323 4675 generic.go:334] "Generic (PLEG): container finished" podID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" containerID="6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7" exitCode=0 Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.653372 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678454f7d4-qg8vh" event={"ID":"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c","Type":"ContainerDied","Data":"6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7"} Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.653405 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678454f7d4-qg8vh" event={"ID":"ed6cac72-fb1c-4f4f-8b2a-924e6647e12c","Type":"ContainerDied","Data":"7d10e5e5940feb7328a72f0d18a49eec7df7eac2a095c1feb308d783b2bb73f1"} Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.653428 4675 scope.go:117] "RemoveContainer" containerID="6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.653589 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-678454f7d4-qg8vh" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.682460 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d6cd9476b-xwbkm" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.698993 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmnb\" (UniqueName: \"kubernetes.io/projected/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-kube-api-access-2cmnb\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.699066 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.699116 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.699155 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.700286 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config\") pod 
\"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.701240 4675 scope.go:117] "RemoveContainer" containerID="36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.703340 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.703968 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-678454f7d4-qg8vh"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.704133 4675 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.705388 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.713822 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-678454f7d4-qg8vh"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.727891 4675 scope.go:117] "RemoveContainer" containerID="6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.729474 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmnb\" (UniqueName: 
\"kubernetes.io/projected/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-kube-api-access-2cmnb\") pod \"openstackclient\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: E0320 16:23:25.733397 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7\": container with ID starting with 6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7 not found: ID does not exist" containerID="6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.733459 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7"} err="failed to get container status \"6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7\": rpc error: code = NotFound desc = could not find container \"6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7\": container with ID starting with 6cfc32499b4d04bb1020a05e43c0e2e40c680f5e207c9d5c52a51d840dd90bb7 not found: ID does not exist" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.733491 4675 scope.go:117] "RemoveContainer" containerID="36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7" Mar 20 16:23:25 crc kubenswrapper[4675]: E0320 16:23:25.736860 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7\": container with ID starting with 36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7 not found: ID does not exist" containerID="36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.736887 4675 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7"} err="failed to get container status \"36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7\": rpc error: code = NotFound desc = could not find container \"36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7\": container with ID starting with 36c054f87c268c8927621b9d1425630db39812690b64c5c697f40db420a24ca7 not found: ID does not exist" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.789920 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7568b66c46-pphb7"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.790224 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7568b66c46-pphb7" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api-log" containerID="cri-o://e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975" gracePeriod=30 Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.790395 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7568b66c46-pphb7" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api" containerID="cri-o://05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a" gracePeriod=30 Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.830415 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.831297 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.831735 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.858200 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.868690 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.870262 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.888975 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.909946 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/226f3568-6214-4345-9991-3bda09594c67-openstack-config-secret\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.910047 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/226f3568-6214-4345-9991-3bda09594c67-openstack-config\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.910108 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226f3568-6214-4345-9991-3bda09594c67-combined-ca-bundle\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.910255 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9z9t\" (UniqueName: \"kubernetes.io/projected/226f3568-6214-4345-9991-3bda09594c67-kube-api-access-b9z9t\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:25 crc kubenswrapper[4675]: I0320 16:23:25.966935 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.012606 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/226f3568-6214-4345-9991-3bda09594c67-openstack-config-secret\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.012860 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/226f3568-6214-4345-9991-3bda09594c67-openstack-config\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.013008 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226f3568-6214-4345-9991-3bda09594c67-combined-ca-bundle\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.013257 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9z9t\" (UniqueName: \"kubernetes.io/projected/226f3568-6214-4345-9991-3bda09594c67-kube-api-access-b9z9t\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc 
kubenswrapper[4675]: I0320 16:23:26.017548 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/226f3568-6214-4345-9991-3bda09594c67-openstack-config\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.020897 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/226f3568-6214-4345-9991-3bda09594c67-openstack-config-secret\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.045651 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9z9t\" (UniqueName: \"kubernetes.io/projected/226f3568-6214-4345-9991-3bda09594c67-kube-api-access-b9z9t\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.045807 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226f3568-6214-4345-9991-3bda09594c67-combined-ca-bundle\") pod \"openstackclient\" (UID: \"226f3568-6214-4345-9991-3bda09594c67\") " pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.140989 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: E0320 16:23:26.156504 4675 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 16:23:26 crc kubenswrapper[4675]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_b24a4e1b-5d34-469f-9a24-6cdd95d93bf7_0(52008ca45cb0d56a5f0c0701e376f1d917b849538e74a0acae731ab69c40546e): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"52008ca45cb0d56a5f0c0701e376f1d917b849538e74a0acae731ab69c40546e" Netns:"/var/run/netns/4d2d5895-d30c-4ccb-a255-3486867ab691" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=52008ca45cb0d56a5f0c0701e376f1d917b849538e74a0acae731ab69c40546e;K8S_POD_UID=b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7]: expected pod UID "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" but got "226f3568-6214-4345-9991-3bda09594c67" from Kube API Mar 20 16:23:26 crc kubenswrapper[4675]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 16:23:26 crc kubenswrapper[4675]: > Mar 20 16:23:26 crc kubenswrapper[4675]: E0320 16:23:26.156567 4675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 16:23:26 crc kubenswrapper[4675]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_openstackclient_openstack_b24a4e1b-5d34-469f-9a24-6cdd95d93bf7_0(52008ca45cb0d56a5f0c0701e376f1d917b849538e74a0acae731ab69c40546e): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"52008ca45cb0d56a5f0c0701e376f1d917b849538e74a0acae731ab69c40546e" Netns:"/var/run/netns/4d2d5895-d30c-4ccb-a255-3486867ab691" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=52008ca45cb0d56a5f0c0701e376f1d917b849538e74a0acae731ab69c40546e;K8S_POD_UID=b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7]: expected pod UID "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" but got "226f3568-6214-4345-9991-3bda09594c67" from Kube API Mar 20 16:23:26 crc kubenswrapper[4675]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 16:23:26 crc kubenswrapper[4675]: > pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.571556 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-564d9f6b4c-5p6js"] Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.645334 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.672829 4675 generic.go:334] "Generic (PLEG): container finished" podID="a5ae6534-550e-4564-8ade-613dfbe1fa32" 
containerID="e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975" exitCode=143 Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.679273 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.683293 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" podUID="226f3568-6214-4345-9991-3bda09594c67" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.697277 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6cac72-fb1c-4f4f-8b2a-924e6647e12c" path="/var/lib/kubelet/pods/ed6cac72-fb1c-4f4f-8b2a-924e6647e12c/volumes" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.698087 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7568b66c46-pphb7" event={"ID":"a5ae6534-550e-4564-8ade-613dfbe1fa32","Type":"ContainerDied","Data":"e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975"} Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.698127 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-564d9f6b4c-5p6js" event={"ID":"28cb2c9b-231d-473b-bbd2-ba4be8c6787e","Type":"ContainerStarted","Data":"f6adeb92428ec5b36df867440b5dd11ae87450350fac497a3b29a82736cd7c76"} Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.698144 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"226f3568-6214-4345-9991-3bda09594c67","Type":"ContainerStarted","Data":"b748e9d5f2a48ef3130f1b83e9c853849cde5c3e234da0cfee6deefdd3e25841"} Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.706059 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.828820 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-combined-ca-bundle\") pod \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.828909 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config-secret\") pod \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.828949 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config\") pod \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.829024 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cmnb\" (UniqueName: \"kubernetes.io/projected/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-kube-api-access-2cmnb\") pod \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\" (UID: \"b24a4e1b-5d34-469f-9a24-6cdd95d93bf7\") " Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.833631 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" (UID: "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.833661 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" (UID: "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.839990 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" (UID: "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.840069 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-kube-api-access-2cmnb" (OuterVolumeSpecName: "kube-api-access-2cmnb") pod "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" (UID: "b24a4e1b-5d34-469f-9a24-6cdd95d93bf7"). InnerVolumeSpecName "kube-api-access-2cmnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.931729 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.931793 4675 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.931805 4675 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4675]: I0320 16:23:26.931813 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cmnb\" (UniqueName: \"kubernetes.io/projected/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7-kube-api-access-2cmnb\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.205981 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.206249 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-central-agent" containerID="cri-o://0c8a37fb85c90d253f250f0faeaa438aa9eb85865730c5d201ceb86047b799f4" gracePeriod=30 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.206562 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-notification-agent" 
containerID="cri-o://09f6434a0f6c781f5df63682f45522e7ecf1b0341fc853292c553ceeed72b65a" gracePeriod=30 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.206635 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="sg-core" containerID="cri-o://dfa8caf51455d0e2e66fb19d48eff9ebc6c39b3ab4ffa7f536da7e1dee06a6df" gracePeriod=30 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.206547 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="proxy-httpd" containerID="cri-o://cb653121160ab646b20f0ee7eb09c16b50d77a374e91186869db35bbefe627ff" gracePeriod=30 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.214099 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.694071 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-564d9f6b4c-5p6js" event={"ID":"28cb2c9b-231d-473b-bbd2-ba4be8c6787e","Type":"ContainerStarted","Data":"b78a268eb98988443a7f744dc98ca99478f2991a6595213e676d835365c7cd5b"} Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.694127 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-564d9f6b4c-5p6js" event={"ID":"28cb2c9b-231d-473b-bbd2-ba4be8c6787e","Type":"ContainerStarted","Data":"e9d8134ef5c0c71b6976a36da30e7a022531bcaa5d7f1ab3e01a43c63723d3d0"} Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.694276 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.694315 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-564d9f6b4c-5p6js" Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705086 
4675 generic.go:334] "Generic (PLEG): container finished" podID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerID="cb653121160ab646b20f0ee7eb09c16b50d77a374e91186869db35bbefe627ff" exitCode=0 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705137 4675 generic.go:334] "Generic (PLEG): container finished" podID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerID="dfa8caf51455d0e2e66fb19d48eff9ebc6c39b3ab4ffa7f536da7e1dee06a6df" exitCode=2 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705145 4675 generic.go:334] "Generic (PLEG): container finished" podID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerID="09f6434a0f6c781f5df63682f45522e7ecf1b0341fc853292c553ceeed72b65a" exitCode=0 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705151 4675 generic.go:334] "Generic (PLEG): container finished" podID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerID="0c8a37fb85c90d253f250f0faeaa438aa9eb85865730c5d201ceb86047b799f4" exitCode=0 Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705160 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerDied","Data":"cb653121160ab646b20f0ee7eb09c16b50d77a374e91186869db35bbefe627ff"} Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705210 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerDied","Data":"dfa8caf51455d0e2e66fb19d48eff9ebc6c39b3ab4ffa7f536da7e1dee06a6df"} Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705227 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerDied","Data":"09f6434a0f6c781f5df63682f45522e7ecf1b0341fc853292c553ceeed72b65a"} Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705232 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.705239 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerDied","Data":"0c8a37fb85c90d253f250f0faeaa438aa9eb85865730c5d201ceb86047b799f4"} Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.713689 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" podUID="226f3568-6214-4345-9991-3bda09594c67" Mar 20 16:23:27 crc kubenswrapper[4675]: I0320 16:23:27.714306 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-564d9f6b4c-5p6js" podStartSLOduration=2.7142904310000002 podStartE2EDuration="2.714290431s" podCreationTimestamp="2026-03-20 16:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:27.710824944 +0000 UTC m=+1327.744454481" watchObservedRunningTime="2026-03-20 16:23:27.714290431 +0000 UTC m=+1327.747919968" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.039522 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154582 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-log-httpd\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154632 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtmzw\" (UniqueName: \"kubernetes.io/projected/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-kube-api-access-mtmzw\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154695 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-sg-core-conf-yaml\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154785 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-scripts\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154844 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-combined-ca-bundle\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154892 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-run-httpd\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.154918 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-config-data\") pod \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\" (UID: \"29172e6f-b7b8-4bcb-ba0c-f13837264cdd\") " Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.155049 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.155874 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.156075 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.173655 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-scripts" (OuterVolumeSpecName: "scripts") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.194032 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.195025 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-kube-api-access-mtmzw" (OuterVolumeSpecName: "kube-api-access-mtmzw") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "kube-api-access-mtmzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.238529 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.258118 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.258162 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.258177 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtmzw\" (UniqueName: \"kubernetes.io/projected/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-kube-api-access-mtmzw\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.258191 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.258204 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.282837 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-config-data" (OuterVolumeSpecName: "config-data") pod "29172e6f-b7b8-4bcb-ba0c-f13837264cdd" (UID: "29172e6f-b7b8-4bcb-ba0c-f13837264cdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.360264 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29172e6f-b7b8-4bcb-ba0c-f13837264cdd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.687846 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24a4e1b-5d34-469f-9a24-6cdd95d93bf7" path="/var/lib/kubelet/pods/b24a4e1b-5d34-469f-9a24-6cdd95d93bf7/volumes" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.705477 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c454cc68b-lmjfb" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.720398 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29172e6f-b7b8-4bcb-ba0c-f13837264cdd","Type":"ContainerDied","Data":"0f62eb8a5b345a4ddd014d4020dba949777482e9dcb37626960de4a30fe752a6"} Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.720474 4675 scope.go:117] "RemoveContainer" containerID="cb653121160ab646b20f0ee7eb09c16b50d77a374e91186869db35bbefe627ff" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.720431 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.750588 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.750806 4675 scope.go:117] "RemoveContainer" containerID="dfa8caf51455d0e2e66fb19d48eff9ebc6c39b3ab4ffa7f536da7e1dee06a6df" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.784073 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.789324 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:28 crc kubenswrapper[4675]: E0320 16:23:28.789754 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-central-agent" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.789951 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-central-agent" Mar 20 16:23:28 crc kubenswrapper[4675]: E0320 16:23:28.789972 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="sg-core" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.789978 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="sg-core" Mar 20 16:23:28 crc kubenswrapper[4675]: E0320 16:23:28.789992 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-notification-agent" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.790000 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-notification-agent" Mar 20 16:23:28 crc kubenswrapper[4675]: E0320 16:23:28.790017 4675 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="proxy-httpd" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.790023 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="proxy-httpd" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.790197 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-notification-agent" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.790212 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="sg-core" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.790227 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="ceilometer-central-agent" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.790237 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" containerName="proxy-httpd" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.791713 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.794121 4675 scope.go:117] "RemoveContainer" containerID="09f6434a0f6c781f5df63682f45522e7ecf1b0341fc853292c553ceeed72b65a" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.794699 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.801382 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.801463 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.856164 4675 scope.go:117] "RemoveContainer" containerID="0c8a37fb85c90d253f250f0faeaa438aa9eb85865730c5d201ceb86047b799f4" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970538 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970601 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-config-data\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970625 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn4bx\" (UniqueName: \"kubernetes.io/projected/0f56a43e-67e1-4a44-b579-55925e6d8745-kube-api-access-mn4bx\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " 
pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970643 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-run-httpd\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970672 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-scripts\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970757 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-log-httpd\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970822 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970930 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7568b66c46-pphb7" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:55956->10.217.0.169:9311: read: connection reset by peer" Mar 20 16:23:28 crc kubenswrapper[4675]: I0320 16:23:28.970946 4675 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-7568b66c46-pphb7" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:55946->10.217.0.169:9311: read: connection reset by peer" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072610 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-log-httpd\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072650 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072723 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072753 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-config-data\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072799 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn4bx\" (UniqueName: \"kubernetes.io/projected/0f56a43e-67e1-4a44-b579-55925e6d8745-kube-api-access-mn4bx\") pod 
\"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072821 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-run-httpd\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.072845 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-scripts\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.073690 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-log-httpd\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.073895 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-run-httpd\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.077448 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.078254 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.089753 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-config-data\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.090463 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-scripts\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.091150 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn4bx\" (UniqueName: \"kubernetes.io/projected/0f56a43e-67e1-4a44-b579-55925e6d8745-kube-api-access-mn4bx\") pod \"ceilometer-0\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") " pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.109223 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.315525 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7568b66c46-pphb7" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.480635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data-custom\") pod \"a5ae6534-550e-4564-8ade-613dfbe1fa32\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.481098 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-combined-ca-bundle\") pod \"a5ae6534-550e-4564-8ade-613dfbe1fa32\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.481200 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ae6534-550e-4564-8ade-613dfbe1fa32-logs\") pod \"a5ae6534-550e-4564-8ade-613dfbe1fa32\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.481367 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvdth\" (UniqueName: \"kubernetes.io/projected/a5ae6534-550e-4564-8ade-613dfbe1fa32-kube-api-access-lvdth\") pod \"a5ae6534-550e-4564-8ade-613dfbe1fa32\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.481486 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data\") pod \"a5ae6534-550e-4564-8ade-613dfbe1fa32\" (UID: \"a5ae6534-550e-4564-8ade-613dfbe1fa32\") " Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.482429 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a5ae6534-550e-4564-8ade-613dfbe1fa32-logs" (OuterVolumeSpecName: "logs") pod "a5ae6534-550e-4564-8ade-613dfbe1fa32" (UID: "a5ae6534-550e-4564-8ade-613dfbe1fa32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.487405 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a5ae6534-550e-4564-8ade-613dfbe1fa32" (UID: "a5ae6534-550e-4564-8ade-613dfbe1fa32"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.487675 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ae6534-550e-4564-8ade-613dfbe1fa32-kube-api-access-lvdth" (OuterVolumeSpecName: "kube-api-access-lvdth") pod "a5ae6534-550e-4564-8ade-613dfbe1fa32" (UID: "a5ae6534-550e-4564-8ade-613dfbe1fa32"). InnerVolumeSpecName "kube-api-access-lvdth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.512929 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5ae6534-550e-4564-8ade-613dfbe1fa32" (UID: "a5ae6534-550e-4564-8ade-613dfbe1fa32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.529863 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data" (OuterVolumeSpecName: "config-data") pod "a5ae6534-550e-4564-8ade-613dfbe1fa32" (UID: "a5ae6534-550e-4564-8ade-613dfbe1fa32"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.583841 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.583876 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ae6534-550e-4564-8ade-613dfbe1fa32-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.583891 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvdth\" (UniqueName: \"kubernetes.io/projected/a5ae6534-550e-4564-8ade-613dfbe1fa32-kube-api-access-lvdth\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.583902 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.583912 4675 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5ae6534-550e-4564-8ade-613dfbe1fa32-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.623076 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.736032 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerStarted","Data":"294909dce7bfae826f7092a9e151765da0a8567f8e1bf07f38699203580b28a8"} Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.740536 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerID="05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a" exitCode=0 Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.740573 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7568b66c46-pphb7" event={"ID":"a5ae6534-550e-4564-8ade-613dfbe1fa32","Type":"ContainerDied","Data":"05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a"} Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.740594 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7568b66c46-pphb7" event={"ID":"a5ae6534-550e-4564-8ade-613dfbe1fa32","Type":"ContainerDied","Data":"8611cb742b9eed1767b4244f69ef7878c73f56022b003827c72f43f2cba81a52"} Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.740614 4675 scope.go:117] "RemoveContainer" containerID="05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a" Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.740614 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7568b66c46-pphb7"
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.765354 4675 scope.go:117] "RemoveContainer" containerID="e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975"
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.783102 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7568b66c46-pphb7"]
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.801615 4675 scope.go:117] "RemoveContainer" containerID="05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a"
Mar 20 16:23:29 crc kubenswrapper[4675]: E0320 16:23:29.805524 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a\": container with ID starting with 05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a not found: ID does not exist" containerID="05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a"
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.805579 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a"} err="failed to get container status \"05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a\": rpc error: code = NotFound desc = could not find container \"05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a\": container with ID starting with 05366b3989bd27d1c0fbb917e1af1c1d7f6f80e38de5f9b1ee40dcbeafd71c3a not found: ID does not exist"
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.805611 4675 scope.go:117] "RemoveContainer" containerID="e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975"
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.808678 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7568b66c46-pphb7"]
Mar 20 16:23:29 crc kubenswrapper[4675]: E0320 16:23:29.809335 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975\": container with ID starting with e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975 not found: ID does not exist" containerID="e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975"
Mar 20 16:23:29 crc kubenswrapper[4675]: I0320 16:23:29.809408 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975"} err="failed to get container status \"e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975\": rpc error: code = NotFound desc = could not find container \"e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975\": container with ID starting with e76367ba25a692ee840c8667fe02e3c403234865ad594132425b94b6dfe8e975 not found: ID does not exist"
Mar 20 16:23:30 crc kubenswrapper[4675]: I0320 16:23:30.689002 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29172e6f-b7b8-4bcb-ba0c-f13837264cdd" path="/var/lib/kubelet/pods/29172e6f-b7b8-4bcb-ba0c-f13837264cdd/volumes"
Mar 20 16:23:30 crc kubenswrapper[4675]: I0320 16:23:30.690645 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" path="/var/lib/kubelet/pods/a5ae6534-550e-4564-8ade-613dfbe1fa32/volumes"
Mar 20 16:23:30 crc kubenswrapper[4675]: I0320 16:23:30.756023 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerStarted","Data":"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"}
Mar 20 16:23:31 crc kubenswrapper[4675]: I0320 16:23:31.178500 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 16:23:31 crc kubenswrapper[4675]: I0320 16:23:31.768063 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerStarted","Data":"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"}
Mar 20 16:23:34 crc kubenswrapper[4675]: I0320 16:23:34.424782 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:23:34 crc kubenswrapper[4675]: I0320 16:23:34.425189 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:23:35 crc kubenswrapper[4675]: I0320 16:23:35.839454 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-564d9f6b4c-5p6js"
Mar 20 16:23:35 crc kubenswrapper[4675]: I0320 16:23:35.842218 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-564d9f6b4c-5p6js"
Mar 20 16:23:37 crc kubenswrapper[4675]: I0320 16:23:37.772868 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:23:38 crc kubenswrapper[4675]: I0320 16:23:38.705608 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c454cc68b-lmjfb" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Mar 20 16:23:38 crc kubenswrapper[4675]: I0320 16:23:38.706012 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c454cc68b-lmjfb"
Mar 20 16:23:38 crc kubenswrapper[4675]: I0320 16:23:38.834193 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"226f3568-6214-4345-9991-3bda09594c67","Type":"ContainerStarted","Data":"a43d725b729af551229fa5d60b3760853afb5df2202507d0098dac99bb2fc8b8"}
Mar 20 16:23:38 crc kubenswrapper[4675]: I0320 16:23:38.837102 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerStarted","Data":"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"}
Mar 20 16:23:38 crc kubenswrapper[4675]: I0320 16:23:38.854575 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.176400483 podStartE2EDuration="13.854559149s" podCreationTimestamp="2026-03-20 16:23:25 +0000 UTC" firstStartedPulling="2026-03-20 16:23:26.649134393 +0000 UTC m=+1326.682763930" lastFinishedPulling="2026-03-20 16:23:38.327293059 +0000 UTC m=+1338.360922596" observedRunningTime="2026-03-20 16:23:38.847736789 +0000 UTC m=+1338.881366326" watchObservedRunningTime="2026-03-20 16:23:38.854559149 +0000 UTC m=+1338.888188686"
Mar 20 16:23:40 crc kubenswrapper[4675]: I0320 16:23:40.299265 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f9ffbdb49-tcvn8"
Mar 20 16:23:40 crc kubenswrapper[4675]: I0320 16:23:40.394374 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-865b456f44-shc9z"]
Mar 20 16:23:40 crc kubenswrapper[4675]: I0320 16:23:40.394839 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-865b456f44-shc9z" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-api" containerID="cri-o://fae7ab551b9c0037632bcaffa816c711fb33b545e85c0e139da1661897c23f42" gracePeriod=30
Mar 20 16:23:40 crc kubenswrapper[4675]: I0320 16:23:40.395503 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-865b456f44-shc9z" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-httpd" containerID="cri-o://e41dc63a82c84b7e7442f52147268d76150695d914c4d93e30a5de24cdca7695" gracePeriod=30
Mar 20 16:23:40 crc kubenswrapper[4675]: I0320 16:23:40.854109 4675 generic.go:334] "Generic (PLEG): container finished" podID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerID="e41dc63a82c84b7e7442f52147268d76150695d914c4d93e30a5de24cdca7695" exitCode=0
Mar 20 16:23:40 crc kubenswrapper[4675]: I0320 16:23:40.854165 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-865b456f44-shc9z" event={"ID":"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58","Type":"ContainerDied","Data":"e41dc63a82c84b7e7442f52147268d76150695d914c4d93e30a5de24cdca7695"}
Mar 20 16:23:41 crc kubenswrapper[4675]: I0320 16:23:41.899431 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerStarted","Data":"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"}
Mar 20 16:23:41 crc kubenswrapper[4675]: I0320 16:23:41.899580 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-central-agent" containerID="cri-o://a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817" gracePeriod=30
Mar 20 16:23:41 crc kubenswrapper[4675]: I0320 16:23:41.899854 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:23:41 crc kubenswrapper[4675]: I0320 16:23:41.900001 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="sg-core" containerID="cri-o://d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb" gracePeriod=30
Mar 20 16:23:41 crc kubenswrapper[4675]: I0320 16:23:41.900078 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="proxy-httpd" containerID="cri-o://e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee" gracePeriod=30
Mar 20 16:23:41 crc kubenswrapper[4675]: I0320 16:23:41.900153 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-notification-agent" containerID="cri-o://5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9" gracePeriod=30
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.734234 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827266 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-scripts\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827355 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-combined-ca-bundle\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827382 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-sg-core-conf-yaml\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827410 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn4bx\" (UniqueName: \"kubernetes.io/projected/0f56a43e-67e1-4a44-b579-55925e6d8745-kube-api-access-mn4bx\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827504 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-run-httpd\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827524 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-log-httpd\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.827589 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-config-data\") pod \"0f56a43e-67e1-4a44-b579-55925e6d8745\" (UID: \"0f56a43e-67e1-4a44-b579-55925e6d8745\") "
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.829154 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.829294 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.840455 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-scripts" (OuterVolumeSpecName: "scripts") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.860186 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f56a43e-67e1-4a44-b579-55925e6d8745-kube-api-access-mn4bx" (OuterVolumeSpecName: "kube-api-access-mn4bx") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "kube-api-access-mn4bx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.922992 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924373 4675 generic.go:334] "Generic (PLEG): container finished" podID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee" exitCode=0
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924409 4675 generic.go:334] "Generic (PLEG): container finished" podID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb" exitCode=2
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924422 4675 generic.go:334] "Generic (PLEG): container finished" podID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9" exitCode=0
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924431 4675 generic.go:334] "Generic (PLEG): container finished" podID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817" exitCode=0
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924454 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerDied","Data":"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"}
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924482 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerDied","Data":"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"}
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924494 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerDied","Data":"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"}
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924504 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerDied","Data":"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"}
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924514 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f56a43e-67e1-4a44-b579-55925e6d8745","Type":"ContainerDied","Data":"294909dce7bfae826f7092a9e151765da0a8567f8e1bf07f38699203580b28a8"}
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924531 4675 scope.go:117] "RemoveContainer" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.924687 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.930559 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.930587 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f56a43e-67e1-4a44-b579-55925e6d8745-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.930598 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.930637 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.930650 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn4bx\" (UniqueName: \"kubernetes.io/projected/0f56a43e-67e1-4a44-b579-55925e6d8745-kube-api-access-mn4bx\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.969953 4675 scope.go:117] "RemoveContainer" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"
Mar 20 16:23:42 crc kubenswrapper[4675]: I0320 16:23:42.996937 4675 scope.go:117] "RemoveContainer" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.000497 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.022169 4675 scope.go:117] "RemoveContainer" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.022266 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-config-data" (OuterVolumeSpecName: "config-data") pod "0f56a43e-67e1-4a44-b579-55925e6d8745" (UID: "0f56a43e-67e1-4a44-b579-55925e6d8745"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.032832 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.032867 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f56a43e-67e1-4a44-b579-55925e6d8745-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.102157 4675 scope.go:117] "RemoveContainer" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.102707 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": container with ID starting with e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee not found: ID does not exist" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.102751 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"} err="failed to get container status \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": rpc error: code = NotFound desc = could not find container \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": container with ID starting with e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.102790 4675 scope.go:117] "RemoveContainer" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.103318 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": container with ID starting with d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb not found: ID does not exist" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.103362 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"} err="failed to get container status \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": rpc error: code = NotFound desc = could not find container \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": container with ID starting with d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.103394 4675 scope.go:117] "RemoveContainer" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.104044 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": container with ID starting with 5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9 not found: ID does not exist" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.104072 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"} err="failed to get container status \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": rpc error: code = NotFound desc = could not find container \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": container with ID starting with 5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.104091 4675 scope.go:117] "RemoveContainer" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.104552 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": container with ID starting with a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817 not found: ID does not exist" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.104587 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"} err="failed to get container status \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": rpc error: code = NotFound desc = could not find container \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": container with ID starting with a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.104611 4675 scope.go:117] "RemoveContainer" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.104946 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"} err="failed to get container status \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": rpc error: code = NotFound desc = could not find container \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": container with ID starting with e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.104970 4675 scope.go:117] "RemoveContainer" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.107685 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"} err="failed to get container status \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": rpc error: code = NotFound desc = could not find container \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": container with ID starting with d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.107718 4675 scope.go:117] "RemoveContainer" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.111116 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"} err="failed to get container status \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": rpc error: code = NotFound desc = could not find container \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": container with ID starting with 5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.111185 4675 scope.go:117] "RemoveContainer" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.112203 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"} err="failed to get container status \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": rpc error: code = NotFound desc = could not find container \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": container with ID starting with a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.112231 4675 scope.go:117] "RemoveContainer" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.112603 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"} err="failed to get container status \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": rpc error: code = NotFound desc = could not find container \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": container with ID starting with e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.112648 4675 scope.go:117] "RemoveContainer" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.113096 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"} err="failed to get container status \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": rpc error: code = NotFound desc = could not find container \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": container with ID starting with d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.113124 4675 scope.go:117] "RemoveContainer" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.113441 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"} err="failed to get container status \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": rpc error: code = NotFound desc = could not find container \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": container with ID starting with 5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.113465 4675 scope.go:117] "RemoveContainer" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.113779 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"} err="failed to get container status \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": rpc error: code = NotFound desc = could not find container \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": container with ID starting with a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.113806 4675 scope.go:117] "RemoveContainer" containerID="e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.114415 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee"} err="failed to get container status \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": rpc error: code = NotFound desc = could not find container \"e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee\": container with ID starting with e84b28e22a96c7c587e8a0989351b0e7430c8a7dbc42eb8504311b4fcd2c46ee not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.114450 4675 scope.go:117] "RemoveContainer" containerID="d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.114722 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb"} err="failed to get container status \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": rpc error: code = NotFound desc = could not find container \"d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb\": container with ID starting with d6a9e5bec6099ced8d460dc2b857a8632af48bcb8ae707cae60aaa0ef08bc2fb not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.114746 4675 scope.go:117] "RemoveContainer" containerID="5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.115026 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9"} err="failed to get container status \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": rpc error: code = NotFound desc = could not find container \"5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9\": container with ID starting with 5258ceaf815c6e7b07b21307074a9d2af8ff00c103baa6acaeaa677191dec8e9 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.115055 4675 scope.go:117] "RemoveContainer" containerID="a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.115306 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817"} err="failed to get container status \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": rpc error: code = NotFound desc = could not find container \"a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817\": container with ID starting with a1a946af087ee752cc81dd848e3fcc95ac2fd7137adcad3ef7293c5344c3e817 not found: ID does not exist"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.267943 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.280172 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.293405 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.293903 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="sg-core"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.293931 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="sg-core"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.293954 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-notification-agent"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.293964 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-notification-agent"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.293977 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.293988 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.294010 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-central-agent"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294018 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-central-agent"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.294037 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="proxy-httpd"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294045 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="proxy-httpd"
Mar 20 16:23:43 crc kubenswrapper[4675]: E0320 16:23:43.294062 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api-log"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294072 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api-log"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294298 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api-log"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294328 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-notification-agent"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294340 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="ceilometer-central-agent"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294350 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="sg-core"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294368 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ae6534-550e-4564-8ade-613dfbe1fa32" containerName="barbican-api"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.294380 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" containerName="proxy-httpd"
Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.296465 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.298724 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.298750 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.307216 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440455 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440571 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-scripts\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440636 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440741 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkgp8\" (UniqueName: \"kubernetes.io/projected/3df76c2e-6431-42a0-a702-6a9a9987f8c9-kube-api-access-kkgp8\") pod \"ceilometer-0\" (UID: 
\"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440833 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-run-httpd\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440893 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-log-httpd\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.440979 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-config-data\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.543268 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-run-httpd\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544243 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-log-httpd\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544382 4675 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-config-data\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544589 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544691 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-scripts\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544820 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544957 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkgp8\" (UniqueName: \"kubernetes.io/projected/3df76c2e-6431-42a0-a702-6a9a9987f8c9-kube-api-access-kkgp8\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.544169 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-run-httpd\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 
16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.545888 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-log-httpd\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.548987 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-scripts\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.549342 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-config-data\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.549601 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.549845 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.563354 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkgp8\" (UniqueName: \"kubernetes.io/projected/3df76c2e-6431-42a0-a702-6a9a9987f8c9-kube-api-access-kkgp8\") pod \"ceilometer-0\" (UID: 
\"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " pod="openstack/ceilometer-0" Mar 20 16:23:43 crc kubenswrapper[4675]: I0320 16:23:43.613407 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.048022 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.408134 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9kb49"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.410171 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.433719 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9kb49"] Mar 20 16:23:44 crc kubenswrapper[4675]: W0320 16:23:44.465008 4675 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f56a43e_67e1_4a44_b579_55925e6d8745.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f56a43e_67e1_4a44_b579_55925e6d8745.slice: no such file or directory Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.563684 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg2j\" (UniqueName: \"kubernetes.io/projected/93745d9e-fb27-46b1-9305-de6265b0cc8d-kube-api-access-cwg2j\") pod \"nova-api-db-create-9kb49\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.563752 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/93745d9e-fb27-46b1-9305-de6265b0cc8d-operator-scripts\") pod \"nova-api-db-create-9kb49\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.618545 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pkw66"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.619751 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.657159 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pkw66"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.674811 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg2j\" (UniqueName: \"kubernetes.io/projected/93745d9e-fb27-46b1-9305-de6265b0cc8d-kube-api-access-cwg2j\") pod \"nova-api-db-create-9kb49\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.675066 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93745d9e-fb27-46b1-9305-de6265b0cc8d-operator-scripts\") pod \"nova-api-db-create-9kb49\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.676077 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93745d9e-fb27-46b1-9305-de6265b0cc8d-operator-scripts\") pod \"nova-api-db-create-9kb49\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.746466 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-cwg2j\" (UniqueName: \"kubernetes.io/projected/93745d9e-fb27-46b1-9305-de6265b0cc8d-kube-api-access-cwg2j\") pod \"nova-api-db-create-9kb49\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.751228 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f56a43e-67e1-4a44-b579-55925e6d8745" path="/var/lib/kubelet/pods/0f56a43e-67e1-4a44-b579-55925e6d8745/volumes" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.759019 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-295ch"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.761180 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.780444 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d340bca-db9e-4748-9cad-c3856ffe6edf-operator-scripts\") pod \"nova-cell0-db-create-pkw66\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.780557 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxgl\" (UniqueName: \"kubernetes.io/projected/6d340bca-db9e-4748-9cad-c3856ffe6edf-kube-api-access-8dxgl\") pod \"nova-cell0-db-create-pkw66\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.797229 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-295ch"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.822794 4675 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-fe47-account-create-update-gnq8t"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.824245 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.845178 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.850485 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe47-account-create-update-gnq8t"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.884059 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d340bca-db9e-4748-9cad-c3856ffe6edf-operator-scripts\") pod \"nova-cell0-db-create-pkw66\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.884357 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxgl\" (UniqueName: \"kubernetes.io/projected/6d340bca-db9e-4748-9cad-c3856ffe6edf-kube-api-access-8dxgl\") pod \"nova-cell0-db-create-pkw66\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.884561 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcc971d1-4036-4338-80ba-8f4f00c10b2a-operator-scripts\") pod \"nova-cell1-db-create-295ch\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.884680 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbknn\" (UniqueName: 
\"kubernetes.io/projected/bcc971d1-4036-4338-80ba-8f4f00c10b2a-kube-api-access-xbknn\") pod \"nova-cell1-db-create-295ch\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.885508 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d340bca-db9e-4748-9cad-c3856ffe6edf-operator-scripts\") pod \"nova-cell0-db-create-pkw66\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.886256 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:44 crc kubenswrapper[4675]: E0320 16:23:44.901985 4675 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e972387_c641_42bd_9c3f_69fc70869c8a.slice/crio-conmon-6186dbac985c54733a1640322c538c03ab0fe9a306fd411d2f745642ee54de5f.scope\": RecentStats: unable to find data in memory cache]" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.907471 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f9fc-account-create-update-qwrrp"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.908137 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxgl\" (UniqueName: \"kubernetes.io/projected/6d340bca-db9e-4748-9cad-c3856ffe6edf-kube-api-access-8dxgl\") pod \"nova-cell0-db-create-pkw66\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.908550 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.911126 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.932556 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f9fc-account-create-update-qwrrp"] Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.960078 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerStarted","Data":"c34fc647a4792150b09e4f3f1419508d3efa9f61fb4f93fdab670eda62811976"} Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.962911 4675 generic.go:334] "Generic (PLEG): container finished" podID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerID="6186dbac985c54733a1640322c538c03ab0fe9a306fd411d2f745642ee54de5f" exitCode=137 Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.962964 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c454cc68b-lmjfb" event={"ID":"3e972387-c641-42bd-9c3f-69fc70869c8a","Type":"ContainerDied","Data":"6186dbac985c54733a1640322c538c03ab0fe9a306fd411d2f745642ee54de5f"} Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.962992 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c454cc68b-lmjfb" event={"ID":"3e972387-c641-42bd-9c3f-69fc70869c8a","Type":"ContainerDied","Data":"5d3bbd99939ccc36e5f13f7951334d98a990c57183616b6d2b3dc2a1f206ca3f"} Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.963005 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3bbd99939ccc36e5f13f7951334d98a990c57183616b6d2b3dc2a1f206ca3f" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.987591 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-operator-scripts\") pod \"nova-api-fe47-account-create-update-gnq8t\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.988116 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9r8v\" (UniqueName: \"kubernetes.io/projected/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-kube-api-access-h9r8v\") pod \"nova-api-fe47-account-create-update-gnq8t\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.988187 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcc971d1-4036-4338-80ba-8f4f00c10b2a-operator-scripts\") pod \"nova-cell1-db-create-295ch\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.988229 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbknn\" (UniqueName: \"kubernetes.io/projected/bcc971d1-4036-4338-80ba-8f4f00c10b2a-kube-api-access-xbknn\") pod \"nova-cell1-db-create-295ch\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.988377 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq5rj\" (UniqueName: \"kubernetes.io/projected/474cfa15-2932-4b81-a00e-fc9c6648e91b-kube-api-access-gq5rj\") pod \"nova-cell0-f9fc-account-create-update-qwrrp\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:44 crc 
kubenswrapper[4675]: I0320 16:23:44.988578 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/474cfa15-2932-4b81-a00e-fc9c6648e91b-operator-scripts\") pod \"nova-cell0-f9fc-account-create-update-qwrrp\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:44 crc kubenswrapper[4675]: I0320 16:23:44.990006 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcc971d1-4036-4338-80ba-8f4f00c10b2a-operator-scripts\") pod \"nova-cell1-db-create-295ch\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.007872 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbknn\" (UniqueName: \"kubernetes.io/projected/bcc971d1-4036-4338-80ba-8f4f00c10b2a-kube-api-access-xbknn\") pod \"nova-cell1-db-create-295ch\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.032703 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.033190 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.092572 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq5rj\" (UniqueName: \"kubernetes.io/projected/474cfa15-2932-4b81-a00e-fc9c6648e91b-kube-api-access-gq5rj\") pod \"nova-cell0-f9fc-account-create-update-qwrrp\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.092741 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/474cfa15-2932-4b81-a00e-fc9c6648e91b-operator-scripts\") pod \"nova-cell0-f9fc-account-create-update-qwrrp\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.092833 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-operator-scripts\") pod \"nova-api-fe47-account-create-update-gnq8t\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.092907 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9r8v\" (UniqueName: \"kubernetes.io/projected/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-kube-api-access-h9r8v\") pod \"nova-api-fe47-account-create-update-gnq8t\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.096113 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/474cfa15-2932-4b81-a00e-fc9c6648e91b-operator-scripts\") pod \"nova-cell0-f9fc-account-create-update-qwrrp\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.103348 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-operator-scripts\") pod \"nova-api-fe47-account-create-update-gnq8t\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.125443 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq5rj\" (UniqueName: \"kubernetes.io/projected/474cfa15-2932-4b81-a00e-fc9c6648e91b-kube-api-access-gq5rj\") pod \"nova-cell0-f9fc-account-create-update-qwrrp\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.135669 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9r8v\" (UniqueName: \"kubernetes.io/projected/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-kube-api-access-h9r8v\") pod \"nova-api-fe47-account-create-update-gnq8t\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.140477 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6ae5-account-create-update-7mtsv"] Mar 20 16:23:45 crc kubenswrapper[4675]: E0320 16:23:45.140928 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon-log" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.140942 4675 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon-log" Mar 20 16:23:45 crc kubenswrapper[4675]: E0320 16:23:45.140983 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.140992 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.141213 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon-log" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.141228 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" containerName="horizon" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.145990 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.148165 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.157413 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6ae5-account-create-update-7mtsv"] Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195330 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-config-data\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195539 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-tls-certs\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195613 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-combined-ca-bundle\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195673 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b7rf\" (UniqueName: \"kubernetes.io/projected/3e972387-c641-42bd-9c3f-69fc70869c8a-kube-api-access-2b7rf\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195703 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-scripts\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195733 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e972387-c641-42bd-9c3f-69fc70869c8a-logs\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.195752 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-secret-key\") pod \"3e972387-c641-42bd-9c3f-69fc70869c8a\" (UID: \"3e972387-c641-42bd-9c3f-69fc70869c8a\") " Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.203414 
4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e972387-c641-42bd-9c3f-69fc70869c8a-kube-api-access-2b7rf" (OuterVolumeSpecName: "kube-api-access-2b7rf") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "kube-api-access-2b7rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.206215 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e972387-c641-42bd-9c3f-69fc70869c8a-logs" (OuterVolumeSpecName: "logs") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.247355 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.247430 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-config-data" (OuterVolumeSpecName: "config-data") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.263752 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-scripts" (OuterVolumeSpecName: "scripts") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.265862 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.290968 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "3e972387-c641-42bd-9c3f-69fc70869c8a" (UID: "3e972387-c641-42bd-9c3f-69fc70869c8a"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.299163 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18287f0-a719-4ea8-badd-3f2f13bd4209-operator-scripts\") pod \"nova-cell1-6ae5-account-create-update-7mtsv\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.299302 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snnsg\" (UniqueName: \"kubernetes.io/projected/a18287f0-a719-4ea8-badd-3f2f13bd4209-kube-api-access-snnsg\") pod \"nova-cell1-6ae5-account-create-update-7mtsv\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.299661 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300018 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b7rf\" (UniqueName: \"kubernetes.io/projected/3e972387-c641-42bd-9c3f-69fc70869c8a-kube-api-access-2b7rf\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300438 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300456 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e972387-c641-42bd-9c3f-69fc70869c8a-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300468 4675 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300478 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e972387-c641-42bd-9c3f-69fc70869c8a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300490 4675 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.300501 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e972387-c641-42bd-9c3f-69fc70869c8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.302513 4675 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.328577 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.402407 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snnsg\" (UniqueName: \"kubernetes.io/projected/a18287f0-a719-4ea8-badd-3f2f13bd4209-kube-api-access-snnsg\") pod \"nova-cell1-6ae5-account-create-update-7mtsv\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.402562 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18287f0-a719-4ea8-badd-3f2f13bd4209-operator-scripts\") pod \"nova-cell1-6ae5-account-create-update-7mtsv\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.403325 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18287f0-a719-4ea8-badd-3f2f13bd4209-operator-scripts\") pod \"nova-cell1-6ae5-account-create-update-7mtsv\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.429757 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snnsg\" (UniqueName: \"kubernetes.io/projected/a18287f0-a719-4ea8-badd-3f2f13bd4209-kube-api-access-snnsg\") pod \"nova-cell1-6ae5-account-create-update-7mtsv\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " 
pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.458142 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9kb49"] Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.614281 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pkw66"] Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.614661 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:45 crc kubenswrapper[4675]: W0320 16:23:45.632821 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d340bca_db9e_4748_9cad_c3856ffe6edf.slice/crio-2a64f25bd14ea4e8ff5fabd7954a8e9c597c37ccbffc69f906b01c6cdc575553 WatchSource:0}: Error finding container 2a64f25bd14ea4e8ff5fabd7954a8e9c597c37ccbffc69f906b01c6cdc575553: Status 404 returned error can't find the container with id 2a64f25bd14ea4e8ff5fabd7954a8e9c597c37ccbffc69f906b01c6cdc575553 Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.969698 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-295ch"] Mar 20 16:23:45 crc kubenswrapper[4675]: I0320 16:23:45.983917 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkw66" event={"ID":"6d340bca-db9e-4748-9cad-c3856ffe6edf","Type":"ContainerStarted","Data":"2a64f25bd14ea4e8ff5fabd7954a8e9c597c37ccbffc69f906b01c6cdc575553"} Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.004817 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f9fc-account-create-update-qwrrp"] Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.021591 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerStarted","Data":"4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4"} Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.041708 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c454cc68b-lmjfb" Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.041790 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kb49" event={"ID":"93745d9e-fb27-46b1-9305-de6265b0cc8d","Type":"ContainerStarted","Data":"068b2f6a922b56269ea7c9718d948de23081e7141c862895b5504c4eed5657d9"} Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.185185 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe47-account-create-update-gnq8t"] Mar 20 16:23:46 crc kubenswrapper[4675]: W0320 16:23:46.189150 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbfc2b2e_09f6_4be6_a4d2_9c0fe1d4d411.slice/crio-2fdd9903f74b77a1bb06ada02764fef5659ec913a8d61d39a1c4f5e23f258b51 WatchSource:0}: Error finding container 2fdd9903f74b77a1bb06ada02764fef5659ec913a8d61d39a1c4f5e23f258b51: Status 404 returned error can't find the container with id 2fdd9903f74b77a1bb06ada02764fef5659ec913a8d61d39a1c4f5e23f258b51 Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.319742 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c454cc68b-lmjfb"] Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.326065 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c454cc68b-lmjfb"] Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.363197 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6ae5-account-create-update-7mtsv"] Mar 20 16:23:46 crc kubenswrapper[4675]: I0320 16:23:46.687093 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3e972387-c641-42bd-9c3f-69fc70869c8a" path="/var/lib/kubelet/pods/3e972387-c641-42bd-9c3f-69fc70869c8a/volumes" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.056794 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" event={"ID":"a18287f0-a719-4ea8-badd-3f2f13bd4209","Type":"ContainerStarted","Data":"545dee13ff9a3eafbd98459fc2690aca565d5de139fcb60d43e815c0e7d5f879"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.056865 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" event={"ID":"a18287f0-a719-4ea8-badd-3f2f13bd4209","Type":"ContainerStarted","Data":"8fa10751f983ab0721443606ed204089ab93c750fc0102015037bff864cb0b64"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.059091 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-295ch" event={"ID":"bcc971d1-4036-4338-80ba-8f4f00c10b2a","Type":"ContainerStarted","Data":"42085d7c9d650b55c53543050b27eabe170a4e173ca53a32e847bbea569ba20f"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.059121 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-295ch" event={"ID":"bcc971d1-4036-4338-80ba-8f4f00c10b2a","Type":"ContainerStarted","Data":"f19f1636ab9b2c07daa3d1f4fe8185cb8b183b5cfa8acc65d7139adfe335bd14"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.066406 4675 generic.go:334] "Generic (PLEG): container finished" podID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerID="fae7ab551b9c0037632bcaffa816c711fb33b545e85c0e139da1661897c23f42" exitCode=0 Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.066547 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-865b456f44-shc9z" event={"ID":"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58","Type":"ContainerDied","Data":"fae7ab551b9c0037632bcaffa816c711fb33b545e85c0e139da1661897c23f42"} Mar 20 16:23:47 crc 
kubenswrapper[4675]: I0320 16:23:47.069020 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe47-account-create-update-gnq8t" event={"ID":"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411","Type":"ContainerStarted","Data":"565e9443c1e2334943f608354e27faef29f650b21304a7a47a30de30074de19f"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.069271 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe47-account-create-update-gnq8t" event={"ID":"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411","Type":"ContainerStarted","Data":"2fdd9903f74b77a1bb06ada02764fef5659ec913a8d61d39a1c4f5e23f258b51"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.071331 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kb49" event={"ID":"93745d9e-fb27-46b1-9305-de6265b0cc8d","Type":"ContainerStarted","Data":"ec3ea6e5cb1890c029df8e139003b98a95136a354544197bfc0b5b7cc9dbbbf2"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.074619 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" event={"ID":"474cfa15-2932-4b81-a00e-fc9c6648e91b","Type":"ContainerStarted","Data":"1cd32ac5e10f7a48435c9cb3c618fabc8b405051c33448b1ed80c1fdeff879d9"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.074676 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" event={"ID":"474cfa15-2932-4b81-a00e-fc9c6648e91b","Type":"ContainerStarted","Data":"ef6e249b278006d6d0e9a5627d70534eb1685290cd1d8b08807ca8ebb38ecc67"} Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.090438 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9kb49" podStartSLOduration=3.090420156 podStartE2EDuration="3.090420156s" podCreationTimestamp="2026-03-20 16:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-20 16:23:47.089049348 +0000 UTC m=+1347.122678885" watchObservedRunningTime="2026-03-20 16:23:47.090420156 +0000 UTC m=+1347.124049693" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.117398 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-pkw66" podStartSLOduration=3.117373026 podStartE2EDuration="3.117373026s" podCreationTimestamp="2026-03-20 16:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:47.107921123 +0000 UTC m=+1347.141550660" watchObservedRunningTime="2026-03-20 16:23:47.117373026 +0000 UTC m=+1347.151002563" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.292377 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.292968 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-log" containerID="cri-o://03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd" gracePeriod=30 Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.293459 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-httpd" containerID="cri-o://3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630" gracePeriod=30 Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.654596 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.774784 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-ovndb-tls-certs\") pod \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.774841 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-combined-ca-bundle\") pod \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.774937 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-httpd-config\") pod \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.774976 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-config\") pod \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.775022 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrxf4\" (UniqueName: \"kubernetes.io/projected/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-kube-api-access-vrxf4\") pod \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\" (UID: \"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58\") " Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.780373 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" (UID: "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.786084 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-kube-api-access-vrxf4" (OuterVolumeSpecName: "kube-api-access-vrxf4") pod "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" (UID: "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58"). InnerVolumeSpecName "kube-api-access-vrxf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.834875 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-config" (OuterVolumeSpecName: "config") pod "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" (UID: "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.838346 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" (UID: "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.862031 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" (UID: "05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.878523 4675 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.878558 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.878570 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.878581 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:47 crc kubenswrapper[4675]: I0320 16:23:47.878593 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrxf4\" (UniqueName: \"kubernetes.io/projected/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58-kube-api-access-vrxf4\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.093207 4675 generic.go:334] "Generic (PLEG): container finished" podID="93745d9e-fb27-46b1-9305-de6265b0cc8d" containerID="ec3ea6e5cb1890c029df8e139003b98a95136a354544197bfc0b5b7cc9dbbbf2" exitCode=0 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.093364 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kb49" event={"ID":"93745d9e-fb27-46b1-9305-de6265b0cc8d","Type":"ContainerDied","Data":"ec3ea6e5cb1890c029df8e139003b98a95136a354544197bfc0b5b7cc9dbbbf2"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 
16:23:48.097079 4675 generic.go:334] "Generic (PLEG): container finished" podID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerID="03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd" exitCode=143 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.097144 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9de8c7f-b5e1-4222-ac09-24fa8e26b089","Type":"ContainerDied","Data":"03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.103333 4675 generic.go:334] "Generic (PLEG): container finished" podID="bcc971d1-4036-4338-80ba-8f4f00c10b2a" containerID="42085d7c9d650b55c53543050b27eabe170a4e173ca53a32e847bbea569ba20f" exitCode=0 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.103434 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-295ch" event={"ID":"bcc971d1-4036-4338-80ba-8f4f00c10b2a","Type":"ContainerDied","Data":"42085d7c9d650b55c53543050b27eabe170a4e173ca53a32e847bbea569ba20f"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.107088 4675 generic.go:334] "Generic (PLEG): container finished" podID="a18287f0-a719-4ea8-badd-3f2f13bd4209" containerID="545dee13ff9a3eafbd98459fc2690aca565d5de139fcb60d43e815c0e7d5f879" exitCode=0 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.107172 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" event={"ID":"a18287f0-a719-4ea8-badd-3f2f13bd4209","Type":"ContainerDied","Data":"545dee13ff9a3eafbd98459fc2690aca565d5de139fcb60d43e815c0e7d5f879"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.108800 4675 generic.go:334] "Generic (PLEG): container finished" podID="dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" containerID="565e9443c1e2334943f608354e27faef29f650b21304a7a47a30de30074de19f" exitCode=0 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.108870 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe47-account-create-update-gnq8t" event={"ID":"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411","Type":"ContainerDied","Data":"565e9443c1e2334943f608354e27faef29f650b21304a7a47a30de30074de19f"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.110426 4675 generic.go:334] "Generic (PLEG): container finished" podID="474cfa15-2932-4b81-a00e-fc9c6648e91b" containerID="1cd32ac5e10f7a48435c9cb3c618fabc8b405051c33448b1ed80c1fdeff879d9" exitCode=0 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.110457 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" event={"ID":"474cfa15-2932-4b81-a00e-fc9c6648e91b","Type":"ContainerDied","Data":"1cd32ac5e10f7a48435c9cb3c618fabc8b405051c33448b1ed80c1fdeff879d9"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.112971 4675 generic.go:334] "Generic (PLEG): container finished" podID="6d340bca-db9e-4748-9cad-c3856ffe6edf" containerID="2ab4bebb194df8176b7f0261b07407260f70b70ef3699220892ddbcedfa0db9a" exitCode=0 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.113054 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkw66" event={"ID":"6d340bca-db9e-4748-9cad-c3856ffe6edf","Type":"ContainerDied","Data":"2ab4bebb194df8176b7f0261b07407260f70b70ef3699220892ddbcedfa0db9a"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.116719 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerStarted","Data":"33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.118889 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-865b456f44-shc9z" 
event={"ID":"05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58","Type":"ContainerDied","Data":"ec91cda1e5994f49c175a09687a704fd8ac89307d9b6d622f52124f69dbf9964"} Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.118922 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-865b456f44-shc9z" Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.118943 4675 scope.go:117] "RemoveContainer" containerID="e41dc63a82c84b7e7442f52147268d76150695d914c4d93e30a5de24cdca7695" Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.150172 4675 scope.go:117] "RemoveContainer" containerID="fae7ab551b9c0037632bcaffa816c711fb33b545e85c0e139da1661897c23f42" Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.228929 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-865b456f44-shc9z"] Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.239198 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-865b456f44-shc9z"] Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.625676 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.625933 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-log" containerID="cri-o://ab5b58b6bf2e1a2e52e3f98777d2d6b997308d88bcc779a5ba96a6dda56d377e" gracePeriod=30 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.626085 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-httpd" containerID="cri-o://d866f3970e9e9bc778662ccdfa3abf3d53ca0536f21f70f0a0b5c984d37bae6f" gracePeriod=30 Mar 20 16:23:48 crc kubenswrapper[4675]: I0320 16:23:48.684352 4675 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" path="/var/lib/kubelet/pods/05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58/volumes" Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.129114 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerStarted","Data":"9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536"} Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.132450 4675 generic.go:334] "Generic (PLEG): container finished" podID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerID="ab5b58b6bf2e1a2e52e3f98777d2d6b997308d88bcc779a5ba96a6dda56d377e" exitCode=143 Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.132658 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4f4c66-7029-49ad-aa71-38faa62d3178","Type":"ContainerDied","Data":"ab5b58b6bf2e1a2e52e3f98777d2d6b997308d88bcc779a5ba96a6dda56d377e"} Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.720746 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.835439 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dxgl\" (UniqueName: \"kubernetes.io/projected/6d340bca-db9e-4748-9cad-c3856ffe6edf-kube-api-access-8dxgl\") pod \"6d340bca-db9e-4748-9cad-c3856ffe6edf\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.835520 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d340bca-db9e-4748-9cad-c3856ffe6edf-operator-scripts\") pod \"6d340bca-db9e-4748-9cad-c3856ffe6edf\" (UID: \"6d340bca-db9e-4748-9cad-c3856ffe6edf\") " Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.836559 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d340bca-db9e-4748-9cad-c3856ffe6edf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d340bca-db9e-4748-9cad-c3856ffe6edf" (UID: "6d340bca-db9e-4748-9cad-c3856ffe6edf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.848508 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d340bca-db9e-4748-9cad-c3856ffe6edf-kube-api-access-8dxgl" (OuterVolumeSpecName: "kube-api-access-8dxgl") pod "6d340bca-db9e-4748-9cad-c3856ffe6edf" (UID: "6d340bca-db9e-4748-9cad-c3856ffe6edf"). InnerVolumeSpecName "kube-api-access-8dxgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.939678 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dxgl\" (UniqueName: \"kubernetes.io/projected/6d340bca-db9e-4748-9cad-c3856ffe6edf-kube-api-access-8dxgl\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.939710 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d340bca-db9e-4748-9cad-c3856ffe6edf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4675]: I0320 16:23:49.998897 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.012509 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.023334 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.031053 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.053496 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.141702 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/474cfa15-2932-4b81-a00e-fc9c6648e91b-operator-scripts\") pod \"474cfa15-2932-4b81-a00e-fc9c6648e91b\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.141830 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18287f0-a719-4ea8-badd-3f2f13bd4209-operator-scripts\") pod \"a18287f0-a719-4ea8-badd-3f2f13bd4209\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.141869 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq5rj\" (UniqueName: \"kubernetes.io/projected/474cfa15-2932-4b81-a00e-fc9c6648e91b-kube-api-access-gq5rj\") pod \"474cfa15-2932-4b81-a00e-fc9c6648e91b\" (UID: \"474cfa15-2932-4b81-a00e-fc9c6648e91b\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.141949 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbknn\" (UniqueName: \"kubernetes.io/projected/bcc971d1-4036-4338-80ba-8f4f00c10b2a-kube-api-access-xbknn\") pod \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.141986 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snnsg\" (UniqueName: \"kubernetes.io/projected/a18287f0-a719-4ea8-badd-3f2f13bd4209-kube-api-access-snnsg\") pod \"a18287f0-a719-4ea8-badd-3f2f13bd4209\" (UID: \"a18287f0-a719-4ea8-badd-3f2f13bd4209\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.142015 4675 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-h9r8v\" (UniqueName: \"kubernetes.io/projected/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-kube-api-access-h9r8v\") pod \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.142950 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18287f0-a719-4ea8-badd-3f2f13bd4209-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a18287f0-a719-4ea8-badd-3f2f13bd4209" (UID: "a18287f0-a719-4ea8-badd-3f2f13bd4209"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143264 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474cfa15-2932-4b81-a00e-fc9c6648e91b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "474cfa15-2932-4b81-a00e-fc9c6648e91b" (UID: "474cfa15-2932-4b81-a00e-fc9c6648e91b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143336 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93745d9e-fb27-46b1-9305-de6265b0cc8d-operator-scripts\") pod \"93745d9e-fb27-46b1-9305-de6265b0cc8d\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143383 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcc971d1-4036-4338-80ba-8f4f00c10b2a-operator-scripts\") pod \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\" (UID: \"bcc971d1-4036-4338-80ba-8f4f00c10b2a\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143428 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-operator-scripts\") pod \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\" (UID: \"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143462 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwg2j\" (UniqueName: \"kubernetes.io/projected/93745d9e-fb27-46b1-9305-de6265b0cc8d-kube-api-access-cwg2j\") pod \"93745d9e-fb27-46b1-9305-de6265b0cc8d\" (UID: \"93745d9e-fb27-46b1-9305-de6265b0cc8d\") " Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143918 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18287f0-a719-4ea8-badd-3f2f13bd4209-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.143934 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/474cfa15-2932-4b81-a00e-fc9c6648e91b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.144560 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc971d1-4036-4338-80ba-8f4f00c10b2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcc971d1-4036-4338-80ba-8f4f00c10b2a" (UID: "bcc971d1-4036-4338-80ba-8f4f00c10b2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.144958 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" (UID: "dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.145041 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93745d9e-fb27-46b1-9305-de6265b0cc8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93745d9e-fb27-46b1-9305-de6265b0cc8d" (UID: "93745d9e-fb27-46b1-9305-de6265b0cc8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.152954 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474cfa15-2932-4b81-a00e-fc9c6648e91b-kube-api-access-gq5rj" (OuterVolumeSpecName: "kube-api-access-gq5rj") pod "474cfa15-2932-4b81-a00e-fc9c6648e91b" (UID: "474cfa15-2932-4b81-a00e-fc9c6648e91b"). InnerVolumeSpecName "kube-api-access-gq5rj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.159139 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-kube-api-access-h9r8v" (OuterVolumeSpecName: "kube-api-access-h9r8v") pod "dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" (UID: "dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411"). InnerVolumeSpecName "kube-api-access-h9r8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.161009 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18287f0-a719-4ea8-badd-3f2f13bd4209-kube-api-access-snnsg" (OuterVolumeSpecName: "kube-api-access-snnsg") pod "a18287f0-a719-4ea8-badd-3f2f13bd4209" (UID: "a18287f0-a719-4ea8-badd-3f2f13bd4209"). InnerVolumeSpecName "kube-api-access-snnsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.161866 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93745d9e-fb27-46b1-9305-de6265b0cc8d-kube-api-access-cwg2j" (OuterVolumeSpecName: "kube-api-access-cwg2j") pod "93745d9e-fb27-46b1-9305-de6265b0cc8d" (UID: "93745d9e-fb27-46b1-9305-de6265b0cc8d"). InnerVolumeSpecName "kube-api-access-cwg2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.162311 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc971d1-4036-4338-80ba-8f4f00c10b2a-kube-api-access-xbknn" (OuterVolumeSpecName: "kube-api-access-xbknn") pod "bcc971d1-4036-4338-80ba-8f4f00c10b2a" (UID: "bcc971d1-4036-4338-80ba-8f4f00c10b2a"). InnerVolumeSpecName "kube-api-access-xbknn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.177047 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" event={"ID":"a18287f0-a719-4ea8-badd-3f2f13bd4209","Type":"ContainerDied","Data":"8fa10751f983ab0721443606ed204089ab93c750fc0102015037bff864cb0b64"} Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.177101 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa10751f983ab0721443606ed204089ab93c750fc0102015037bff864cb0b64" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.177178 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6ae5-account-create-update-7mtsv" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.187078 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-295ch" event={"ID":"bcc971d1-4036-4338-80ba-8f4f00c10b2a","Type":"ContainerDied","Data":"f19f1636ab9b2c07daa3d1f4fe8185cb8b183b5cfa8acc65d7139adfe335bd14"} Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.187127 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f19f1636ab9b2c07daa3d1f4fe8185cb8b183b5cfa8acc65d7139adfe335bd14" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.187209 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-295ch" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.191027 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe47-account-create-update-gnq8t" event={"ID":"dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411","Type":"ContainerDied","Data":"2fdd9903f74b77a1bb06ada02764fef5659ec913a8d61d39a1c4f5e23f258b51"} Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.191073 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fdd9903f74b77a1bb06ada02764fef5659ec913a8d61d39a1c4f5e23f258b51" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.191046 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe47-account-create-update-gnq8t" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.196304 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9kb49" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.196364 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kb49" event={"ID":"93745d9e-fb27-46b1-9305-de6265b0cc8d","Type":"ContainerDied","Data":"068b2f6a922b56269ea7c9718d948de23081e7141c862895b5504c4eed5657d9"} Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.196413 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="068b2f6a922b56269ea7c9718d948de23081e7141c862895b5504c4eed5657d9" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.198555 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" event={"ID":"474cfa15-2932-4b81-a00e-fc9c6648e91b","Type":"ContainerDied","Data":"ef6e249b278006d6d0e9a5627d70534eb1685290cd1d8b08807ca8ebb38ecc67"} Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.198593 4675 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ef6e249b278006d6d0e9a5627d70534eb1685290cd1d8b08807ca8ebb38ecc67" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.198666 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f9fc-account-create-update-qwrrp" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.205474 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pkw66" event={"ID":"6d340bca-db9e-4748-9cad-c3856ffe6edf","Type":"ContainerDied","Data":"2a64f25bd14ea4e8ff5fabd7954a8e9c597c37ccbffc69f906b01c6cdc575553"} Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.205519 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pkw66" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.205526 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a64f25bd14ea4e8ff5fabd7954a8e9c597c37ccbffc69f906b01c6cdc575553" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245890 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9r8v\" (UniqueName: \"kubernetes.io/projected/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-kube-api-access-h9r8v\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245927 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93745d9e-fb27-46b1-9305-de6265b0cc8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245941 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcc971d1-4036-4338-80ba-8f4f00c10b2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245953 4675 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245967 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwg2j\" (UniqueName: \"kubernetes.io/projected/93745d9e-fb27-46b1-9305-de6265b0cc8d-kube-api-access-cwg2j\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245978 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq5rj\" (UniqueName: \"kubernetes.io/projected/474cfa15-2932-4b81-a00e-fc9c6648e91b-kube-api-access-gq5rj\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.245991 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbknn\" (UniqueName: \"kubernetes.io/projected/bcc971d1-4036-4338-80ba-8f4f00c10b2a-kube-api-access-xbknn\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.246003 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snnsg\" (UniqueName: \"kubernetes.io/projected/a18287f0-a719-4ea8-badd-3f2f13bd4209-kube-api-access-snnsg\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.485520 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:50 crc kubenswrapper[4675]: I0320 16:23:50.985050 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.166864 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-scripts\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167243 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-config-data\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167305 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-public-tls-certs\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167335 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-combined-ca-bundle\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167489 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnhz2\" (UniqueName: \"kubernetes.io/projected/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-kube-api-access-vnhz2\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167521 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-httpd-run\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167575 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-logs\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.167603 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\" (UID: \"d9de8c7f-b5e1-4222-ac09-24fa8e26b089\") " Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.168009 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.168136 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-logs" (OuterVolumeSpecName: "logs") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.177889 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-scripts" (OuterVolumeSpecName: "scripts") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.178791 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.188008 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-kube-api-access-vnhz2" (OuterVolumeSpecName: "kube-api-access-vnhz2") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "kube-api-access-vnhz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.216919 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.236853 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerStarted","Data":"7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911"} Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.237087 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.237061 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-central-agent" containerID="cri-o://4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4" gracePeriod=30 Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.237193 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-notification-agent" containerID="cri-o://33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d" gracePeriod=30 Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.237230 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="sg-core" containerID="cri-o://9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536" gracePeriod=30 Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.237376 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="proxy-httpd" containerID="cri-o://7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911" gracePeriod=30 Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.240381 4675 generic.go:334] 
"Generic (PLEG): container finished" podID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerID="3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630" exitCode=0 Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.240420 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9de8c7f-b5e1-4222-ac09-24fa8e26b089","Type":"ContainerDied","Data":"3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630"} Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.240444 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9de8c7f-b5e1-4222-ac09-24fa8e26b089","Type":"ContainerDied","Data":"546bfc4aa6bc3c969b064fd16c9388c7fffb08c84df37c65b4ba469ea614034b"} Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.240462 4675 scope.go:117] "RemoveContainer" containerID="3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.240980 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.243027 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.244031 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-config-data" (OuterVolumeSpecName: "config-data") pod "d9de8c7f-b5e1-4222-ac09-24fa8e26b089" (UID: "d9de8c7f-b5e1-4222-ac09-24fa8e26b089"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.263168 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7751944069999999 podStartE2EDuration="8.263147449s" podCreationTimestamp="2026-03-20 16:23:43 +0000 UTC" firstStartedPulling="2026-03-20 16:23:44.045350579 +0000 UTC m=+1344.078980116" lastFinishedPulling="2026-03-20 16:23:50.533303621 +0000 UTC m=+1350.566933158" observedRunningTime="2026-03-20 16:23:51.262298985 +0000 UTC m=+1351.295928522" watchObservedRunningTime="2026-03-20 16:23:51.263147449 +0000 UTC m=+1351.296776986"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269558 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269587 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269596 4675 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269606 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269614 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnhz2\" (UniqueName: \"kubernetes.io/projected/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-kube-api-access-vnhz2\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269622 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269630 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de8c7f-b5e1-4222-ac09-24fa8e26b089-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.269659 4675 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.285386 4675 scope.go:117] "RemoveContainer" containerID="03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.300006 4675 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.311628 4675 scope.go:117] "RemoveContainer" containerID="3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.312209 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630\": container with ID starting with 3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630 not found: ID does not exist" containerID="3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.312286 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630"} err="failed to get container status \"3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630\": rpc error: code = NotFound desc = could not find container \"3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630\": container with ID starting with 3aa2ecd4f7d9e4fe3e35c22163aa229eef9f4e94b5d1b491199a15da451fb630 not found: ID does not exist"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.312319 4675 scope.go:117] "RemoveContainer" containerID="03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.312759 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd\": container with ID starting with 03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd not found: ID does not exist" containerID="03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.312803 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd"} err="failed to get container status \"03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd\": rpc error: code = NotFound desc = could not find container \"03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd\": container with ID starting with 03b0a472983aec7e1feffdf0227f68e691c8a3597d0a4c7cad78abcdcf2d6fbd not found: ID does not exist"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.371340 4675 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.600609 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.608170 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.633590 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.634024 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-api"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.634046 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-api"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.634060 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-httpd"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.634068 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-httpd"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.634079 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-httpd"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.634086 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-httpd"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.634110 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18287f0-a719-4ea8-badd-3f2f13bd4209" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.634118 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18287f0-a719-4ea8-badd-3f2f13bd4209" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.634137 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-log"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.634145 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-log"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.634158 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d340bca-db9e-4748-9cad-c3856ffe6edf" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.634165 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d340bca-db9e-4748-9cad-c3856ffe6edf" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.638822 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.638847 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.638865 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474cfa15-2932-4b81-a00e-fc9c6648e91b" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.638873 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="474cfa15-2932-4b81-a00e-fc9c6648e91b" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.638884 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc971d1-4036-4338-80ba-8f4f00c10b2a" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.638894 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc971d1-4036-4338-80ba-8f4f00c10b2a" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: E0320 16:23:51.638904 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93745d9e-fb27-46b1-9305-de6265b0cc8d" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.638911 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="93745d9e-fb27-46b1-9305-de6265b0cc8d" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639196 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc971d1-4036-4338-80ba-8f4f00c10b2a" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639212 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18287f0-a719-4ea8-badd-3f2f13bd4209" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639229 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-log"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639239 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="474cfa15-2932-4b81-a00e-fc9c6648e91b" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639255 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="93745d9e-fb27-46b1-9305-de6265b0cc8d" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639272 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" containerName="mariadb-account-create-update"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639286 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-httpd"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639302 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" containerName="glance-httpd"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639315 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="05eec34e-bcd5-4ba6-ad16-bacbf5cb4b58" containerName="neutron-api"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.639328 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d340bca-db9e-4748-9cad-c3856ffe6edf" containerName="mariadb-database-create"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.640424 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.645208 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.645411 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.653993 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.783360 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-config-data\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.783892 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-scripts\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.784659 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9de0228-878e-4311-8146-93fdff40b851-logs\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.784697 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdh2\" (UniqueName: \"kubernetes.io/projected/f9de0228-878e-4311-8146-93fdff40b851-kube-api-access-9wdh2\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.784815 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.784864 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.784898 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.784929 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9de0228-878e-4311-8146-93fdff40b851-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.886739 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9de0228-878e-4311-8146-93fdff40b851-logs\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887035 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdh2\" (UniqueName: \"kubernetes.io/projected/f9de0228-878e-4311-8146-93fdff40b851-kube-api-access-9wdh2\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887102 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887159 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887189 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887380 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9de0228-878e-4311-8146-93fdff40b851-logs\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887758 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.887792 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9de0228-878e-4311-8146-93fdff40b851-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.888187 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-config-data\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.888280 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-scripts\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.888053 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9de0228-878e-4311-8146-93fdff40b851-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.892127 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.904175 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.905933 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-config-data\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.908602 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9de0228-878e-4311-8146-93fdff40b851-scripts\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.912256 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdh2\" (UniqueName: \"kubernetes.io/projected/f9de0228-878e-4311-8146-93fdff40b851-kube-api-access-9wdh2\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:51 crc kubenswrapper[4675]: I0320 16:23:51.927043 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f9de0228-878e-4311-8146-93fdff40b851\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.003141 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.320045 4675 generic.go:334] "Generic (PLEG): container finished" podID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerID="7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911" exitCode=0
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.320354 4675 generic.go:334] "Generic (PLEG): container finished" podID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerID="9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536" exitCode=2
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.320363 4675 generic.go:334] "Generic (PLEG): container finished" podID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerID="33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d" exitCode=0
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.320431 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerDied","Data":"7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911"}
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.320459 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerDied","Data":"9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536"}
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.320471 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerDied","Data":"33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d"}
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.379817 4675 generic.go:334] "Generic (PLEG): container finished" podID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerID="d866f3970e9e9bc778662ccdfa3abf3d53ca0536f21f70f0a0b5c984d37bae6f" exitCode=0
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.379869 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4f4c66-7029-49ad-aa71-38faa62d3178","Type":"ContainerDied","Data":"d866f3970e9e9bc778662ccdfa3abf3d53ca0536f21f70f0a0b5c984d37bae6f"}
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.448865 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607487 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607560 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-httpd-run\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-internal-tls-certs\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607676 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-scripts\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607714 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db7g4\" (UniqueName: \"kubernetes.io/projected/bb4f4c66-7029-49ad-aa71-38faa62d3178-kube-api-access-db7g4\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607764 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-config-data\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607814 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-logs\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.607842 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-combined-ca-bundle\") pod \"bb4f4c66-7029-49ad-aa71-38faa62d3178\" (UID: \"bb4f4c66-7029-49ad-aa71-38faa62d3178\") "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.608260 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-logs" (OuterVolumeSpecName: "logs") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.608538 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.613230 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-scripts" (OuterVolumeSpecName: "scripts") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.613389 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.616617 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4f4c66-7029-49ad-aa71-38faa62d3178-kube-api-access-db7g4" (OuterVolumeSpecName: "kube-api-access-db7g4") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "kube-api-access-db7g4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.679380 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.681681 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-config-data" (OuterVolumeSpecName: "config-data") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.692115 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4f4c66-7029-49ad-aa71-38faa62d3178" (UID: "bb4f4c66-7029-49ad-aa71-38faa62d3178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.692561 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9de8c7f-b5e1-4222-ac09-24fa8e26b089" path="/var/lib/kubelet/pods/d9de8c7f-b5e1-4222-ac09-24fa8e26b089/volumes"
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710050 4675 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710083 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710093 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db7g4\" (UniqueName: \"kubernetes.io/projected/bb4f4c66-7029-49ad-aa71-38faa62d3178-kube-api-access-db7g4\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710103 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710111 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710118 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4f4c66-7029-49ad-aa71-38faa62d3178-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710740 4675 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.710754 4675 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4f4c66-7029-49ad-aa71-38faa62d3178-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.738096 4675 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.749545 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:23:52 crc kubenswrapper[4675]: I0320 16:23:52.812694 4675 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.389582 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9de0228-878e-4311-8146-93fdff40b851","Type":"ContainerStarted","Data":"af822a25baee602648057bba1ee6b865c7b6e4c5756b2c3d5631255637466bae"}
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.389871 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9de0228-878e-4311-8146-93fdff40b851","Type":"ContainerStarted","Data":"29817237f1cb79bd956227e6360c90b48fb0e89d66e8eabb731be2de60a7d24b"}
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.391348 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4f4c66-7029-49ad-aa71-38faa62d3178","Type":"ContainerDied","Data":"e6c224423a1ca91154a2184b308505a5d0694fe730fc9c6a3fa51e68bbbdd178"}
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.391397 4675 scope.go:117] "RemoveContainer" containerID="d866f3970e9e9bc778662ccdfa3abf3d53ca0536f21f70f0a0b5c984d37bae6f"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.391481 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.417391 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.434017 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.449988 4675 scope.go:117] "RemoveContainer" containerID="ab5b58b6bf2e1a2e52e3f98777d2d6b997308d88bcc779a5ba96a6dda56d377e"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.452298 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:23:53 crc kubenswrapper[4675]: E0320 16:23:53.452742 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-log"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.452759 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-log"
Mar 20 16:23:53 crc kubenswrapper[4675]: E0320 16:23:53.452814 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-httpd"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.452820 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-httpd"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.452989 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-log"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.453011 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" containerName="glance-httpd"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.453892 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.456867 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.457117 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.469615 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633054 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be7b0e7-cc04-4551-9775-b231792b3e25-logs\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633103 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633134 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633172 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4be7b0e7-cc04-4551-9775-b231792b3e25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633206 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633237 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtff\" (UniqueName: \"kubernetes.io/projected/4be7b0e7-cc04-4551-9775-b231792b3e25-kube-api-access-tgtff\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.633259 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:23:53 crc kubenswrapper[4675]: I0320
16:23:53.633301 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.735249 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4be7b0e7-cc04-4551-9775-b231792b3e25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.735344 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.735396 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtff\" (UniqueName: \"kubernetes.io/projected/4be7b0e7-cc04-4551-9775-b231792b3e25-kube-api-access-tgtff\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.735449 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.735505 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.736168 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be7b0e7-cc04-4551-9775-b231792b3e25-logs\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.736207 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.736239 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.736910 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4be7b0e7-cc04-4551-9775-b231792b3e25-logs\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.737013 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.737099 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4be7b0e7-cc04-4551-9775-b231792b3e25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.740445 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.742755 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.745404 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.753281 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be7b0e7-cc04-4551-9775-b231792b3e25-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.754434 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtff\" (UniqueName: \"kubernetes.io/projected/4be7b0e7-cc04-4551-9775-b231792b3e25-kube-api-access-tgtff\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:53 crc kubenswrapper[4675]: I0320 16:23:53.768629 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4be7b0e7-cc04-4551-9775-b231792b3e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:23:54 crc kubenswrapper[4675]: I0320 16:23:54.071069 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:54 crc kubenswrapper[4675]: I0320 16:23:54.405943 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9de0228-878e-4311-8146-93fdff40b851","Type":"ContainerStarted","Data":"0056b88180688ca44f3937bb0da66cb45f143b8b30fac98a5bff89c8d8cb0ecb"} Mar 20 16:23:54 crc kubenswrapper[4675]: I0320 16:23:54.435334 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.4353153020000002 podStartE2EDuration="3.435315302s" podCreationTimestamp="2026-03-20 16:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:54.434288833 +0000 UTC m=+1354.467918370" watchObservedRunningTime="2026-03-20 16:23:54.435315302 +0000 UTC m=+1354.468944839" Mar 20 16:23:54 crc 
kubenswrapper[4675]: W0320 16:23:54.701692 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4be7b0e7_cc04_4551_9775_b231792b3e25.slice/crio-923ae0e2bda0358773e9ad38127cb459d5f88f49046a53e9377e201ec1403788 WatchSource:0}: Error finding container 923ae0e2bda0358773e9ad38127cb459d5f88f49046a53e9377e201ec1403788: Status 404 returned error can't find the container with id 923ae0e2bda0358773e9ad38127cb459d5f88f49046a53e9377e201ec1403788 Mar 20 16:23:54 crc kubenswrapper[4675]: I0320 16:23:54.708179 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4f4c66-7029-49ad-aa71-38faa62d3178" path="/var/lib/kubelet/pods/bb4f4c66-7029-49ad-aa71-38faa62d3178/volumes" Mar 20 16:23:54 crc kubenswrapper[4675]: I0320 16:23:54.708933 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.255997 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cvcgw"] Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.264039 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.271107 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.272626 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.284406 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gfrf2" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.328847 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cvcgw"] Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.379990 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gl4\" (UniqueName: \"kubernetes.io/projected/e5eba599-99e1-4899-8ae7-0ba38e60724b-kube-api-access-w6gl4\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.380349 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.380390 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-scripts\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " 
pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.380425 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-config-data\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.438697 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4be7b0e7-cc04-4551-9775-b231792b3e25","Type":"ContainerStarted","Data":"fdd29bdb955095a49dbff157121d4c72d18952f24292775f5c8f0e6b028331fa"} Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.438750 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4be7b0e7-cc04-4551-9775-b231792b3e25","Type":"ContainerStarted","Data":"923ae0e2bda0358773e9ad38127cb459d5f88f49046a53e9377e201ec1403788"} Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.490265 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gl4\" (UniqueName: \"kubernetes.io/projected/e5eba599-99e1-4899-8ae7-0ba38e60724b-kube-api-access-w6gl4\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.490322 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.490367 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-scripts\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.490397 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-config-data\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.495584 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.501968 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-scripts\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.503551 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-config-data\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.512180 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-w6gl4\" (UniqueName: \"kubernetes.io/projected/e5eba599-99e1-4899-8ae7-0ba38e60724b-kube-api-access-w6gl4\") pod \"nova-cell0-conductor-db-sync-cvcgw\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:55 crc kubenswrapper[4675]: I0320 16:23:55.645331 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:23:56 crc kubenswrapper[4675]: I0320 16:23:56.116998 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cvcgw"] Mar 20 16:23:56 crc kubenswrapper[4675]: I0320 16:23:56.447991 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4be7b0e7-cc04-4551-9775-b231792b3e25","Type":"ContainerStarted","Data":"d815bf0bd952ab2f683cc2385efc4ff4cf8a8aca487b3d17b8d062bd458fb7b1"} Mar 20 16:23:56 crc kubenswrapper[4675]: I0320 16:23:56.449416 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" event={"ID":"e5eba599-99e1-4899-8ae7-0ba38e60724b","Type":"ContainerStarted","Data":"acfea37de0b66495ee6daf2edf8a864d9f10007efbd5e7181fdc2bcda5604d43"} Mar 20 16:23:56 crc kubenswrapper[4675]: I0320 16:23:56.478175 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.478150883 podStartE2EDuration="3.478150883s" podCreationTimestamp="2026-03-20 16:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:56.465784818 +0000 UTC m=+1356.499414365" watchObservedRunningTime="2026-03-20 16:23:56.478150883 +0000 UTC m=+1356.511780420" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.295042 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.324730 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-scripts\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.324831 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-run-httpd\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.324891 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-log-httpd\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.324922 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-sg-core-conf-yaml\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.324965 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkgp8\" (UniqueName: \"kubernetes.io/projected/3df76c2e-6431-42a0-a702-6a9a9987f8c9-kube-api-access-kkgp8\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.325010 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-combined-ca-bundle\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.325064 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-config-data\") pod \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\" (UID: \"3df76c2e-6431-42a0-a702-6a9a9987f8c9\") " Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.325736 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.328554 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.331878 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-scripts" (OuterVolumeSpecName: "scripts") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.346635 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df76c2e-6431-42a0-a702-6a9a9987f8c9-kube-api-access-kkgp8" (OuterVolumeSpecName: "kube-api-access-kkgp8") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "kube-api-access-kkgp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.371203 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.427658 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.427700 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.427714 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkgp8\" (UniqueName: \"kubernetes.io/projected/3df76c2e-6431-42a0-a702-6a9a9987f8c9-kube-api-access-kkgp8\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.427725 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.427737 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3df76c2e-6431-42a0-a702-6a9a9987f8c9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.430575 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.437720 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-config-data" (OuterVolumeSpecName: "config-data") pod "3df76c2e-6431-42a0-a702-6a9a9987f8c9" (UID: "3df76c2e-6431-42a0-a702-6a9a9987f8c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.462055 4675 generic.go:334] "Generic (PLEG): container finished" podID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerID="4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4" exitCode=0 Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.462926 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.463399 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerDied","Data":"4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4"} Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.463471 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3df76c2e-6431-42a0-a702-6a9a9987f8c9","Type":"ContainerDied","Data":"c34fc647a4792150b09e4f3f1419508d3efa9f61fb4f93fdab670eda62811976"} Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.463492 4675 scope.go:117] "RemoveContainer" containerID="7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.495364 4675 scope.go:117] "RemoveContainer" containerID="9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.518063 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.525279 4675 scope.go:117] "RemoveContainer" containerID="33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.529231 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.529253 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df76c2e-6431-42a0-a702-6a9a9987f8c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.532620 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.550871 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.551244 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-notification-agent" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551262 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-notification-agent" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.551288 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="proxy-httpd" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551294 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="proxy-httpd" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.551313 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="sg-core" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551319 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="sg-core" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.551334 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-central-agent" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551339 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-central-agent" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551545 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-central-agent" 
Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551571 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="ceilometer-notification-agent" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551583 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="sg-core" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.551592 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" containerName="proxy-httpd" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.553178 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.555289 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.555581 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.563698 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.576164 4675 scope.go:117] "RemoveContainer" containerID="4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.600785 4675 scope.go:117] "RemoveContainer" containerID="7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.601284 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911\": container with ID starting with 7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911 not found: ID does not 
exist" containerID="7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.601321 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911"} err="failed to get container status \"7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911\": rpc error: code = NotFound desc = could not find container \"7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911\": container with ID starting with 7405b86d1a24be8eb577760487f97fe6f460796632e9311eadc8de0abf8a8911 not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.601341 4675 scope.go:117] "RemoveContainer" containerID="9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.601698 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536\": container with ID starting with 9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536 not found: ID does not exist" containerID="9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.601720 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536"} err="failed to get container status \"9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536\": rpc error: code = NotFound desc = could not find container \"9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536\": container with ID starting with 9c5e90b27adb2317f694011d23752220007edb98140628e53ecdcb1ae297b536 not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.601733 4675 scope.go:117] 
"RemoveContainer" containerID="33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.602088 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d\": container with ID starting with 33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d not found: ID does not exist" containerID="33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.602112 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d"} err="failed to get container status \"33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d\": rpc error: code = NotFound desc = could not find container \"33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d\": container with ID starting with 33333b7c632032f95a2f41cc63b7137de3a9e58c02c3d9dc1981993b455ba19d not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.602126 4675 scope.go:117] "RemoveContainer" containerID="4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4" Mar 20 16:23:57 crc kubenswrapper[4675]: E0320 16:23:57.602425 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4\": container with ID starting with 4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4 not found: ID does not exist" containerID="4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.602472 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4"} err="failed to get container status \"4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4\": rpc error: code = NotFound desc = could not find container \"4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4\": container with ID starting with 4bc978041ccf515ccd9038e12a2608d32a32010a5d6a1885c1647860797924e4 not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.630995 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.631040 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-config-data\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.631087 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-run-httpd\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.631113 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: 
I0320 16:23:57.631185 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vth\" (UniqueName: \"kubernetes.io/projected/a6ab0c9f-e49c-4456-8515-9b4363d38531-kube-api-access-t5vth\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.631216 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-log-httpd\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.631271 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-scripts\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733003 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733050 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-config-data\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733097 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-run-httpd\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733121 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733155 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vth\" (UniqueName: \"kubernetes.io/projected/a6ab0c9f-e49c-4456-8515-9b4363d38531-kube-api-access-t5vth\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733181 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-log-httpd\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.733234 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-scripts\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.737390 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-run-httpd\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.740214 
4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-log-httpd\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.741524 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.742169 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.755678 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-scripts\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.762959 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-config-data\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.762954 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vth\" (UniqueName: \"kubernetes.io/projected/a6ab0c9f-e49c-4456-8515-9b4363d38531-kube-api-access-t5vth\") pod \"ceilometer-0\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") " 
pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4675]: I0320 16:23:57.876838 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:58 crc kubenswrapper[4675]: I0320 16:23:58.311254 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:58 crc kubenswrapper[4675]: I0320 16:23:58.474756 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerStarted","Data":"8d3b7807e94420a08ab36dee5fd238f6c781011bac24378d78f488f3c78d1b99"} Mar 20 16:23:58 crc kubenswrapper[4675]: I0320 16:23:58.684907 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df76c2e-6431-42a0-a702-6a9a9987f8c9" path="/var/lib/kubelet/pods/3df76c2e-6431-42a0-a702-6a9a9987f8c9/volumes" Mar 20 16:23:59 crc kubenswrapper[4675]: I0320 16:23:59.484866 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerStarted","Data":"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"} Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.134308 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567064-dm796"] Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.135754 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-dm796" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.138871 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.139178 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.139224 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.148350 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-dm796"] Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.178722 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbz9d\" (UniqueName: \"kubernetes.io/projected/7b15ecc4-edbe-4833-9118-7c6a4c7b3352-kube-api-access-cbz9d\") pod \"auto-csr-approver-29567064-dm796\" (UID: \"7b15ecc4-edbe-4833-9118-7c6a4c7b3352\") " pod="openshift-infra/auto-csr-approver-29567064-dm796" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.280332 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbz9d\" (UniqueName: \"kubernetes.io/projected/7b15ecc4-edbe-4833-9118-7c6a4c7b3352-kube-api-access-cbz9d\") pod \"auto-csr-approver-29567064-dm796\" (UID: \"7b15ecc4-edbe-4833-9118-7c6a4c7b3352\") " pod="openshift-infra/auto-csr-approver-29567064-dm796" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.300508 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbz9d\" (UniqueName: \"kubernetes.io/projected/7b15ecc4-edbe-4833-9118-7c6a4c7b3352-kube-api-access-cbz9d\") pod \"auto-csr-approver-29567064-dm796\" (UID: \"7b15ecc4-edbe-4833-9118-7c6a4c7b3352\") " 
pod="openshift-infra/auto-csr-approver-29567064-dm796" Mar 20 16:24:00 crc kubenswrapper[4675]: I0320 16:24:00.485879 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-dm796" Mar 20 16:24:02 crc kubenswrapper[4675]: I0320 16:24:02.003810 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 16:24:02 crc kubenswrapper[4675]: I0320 16:24:02.006287 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 16:24:02 crc kubenswrapper[4675]: I0320 16:24:02.048095 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 16:24:02 crc kubenswrapper[4675]: I0320 16:24:02.051161 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 16:24:02 crc kubenswrapper[4675]: I0320 16:24:02.513025 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 16:24:02 crc kubenswrapper[4675]: I0320 16:24:02.513071 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 16:24:03 crc kubenswrapper[4675]: I0320 16:24:03.527086 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" event={"ID":"e5eba599-99e1-4899-8ae7-0ba38e60724b","Type":"ContainerStarted","Data":"c939029382189ad85b94113049399625b490c4c552687aac5df5d72f3dbe945e"} Mar 20 16:24:03 crc kubenswrapper[4675]: I0320 16:24:03.548099 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" podStartSLOduration=1.411065436 podStartE2EDuration="8.548078408s" podCreationTimestamp="2026-03-20 16:23:55 +0000 UTC" 
firstStartedPulling="2026-03-20 16:23:56.129722158 +0000 UTC m=+1356.163351695" lastFinishedPulling="2026-03-20 16:24:03.26673513 +0000 UTC m=+1363.300364667" observedRunningTime="2026-03-20 16:24:03.544605622 +0000 UTC m=+1363.578235159" watchObservedRunningTime="2026-03-20 16:24:03.548078408 +0000 UTC m=+1363.581707945" Mar 20 16:24:03 crc kubenswrapper[4675]: I0320 16:24:03.619566 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-dm796"] Mar 20 16:24:03 crc kubenswrapper[4675]: W0320 16:24:03.622353 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b15ecc4_edbe_4833_9118_7c6a4c7b3352.slice/crio-363a1bfb64d73e8cdfdce1f2035df1d8ff86738ffc490e079c3e32bbc193bb5d WatchSource:0}: Error finding container 363a1bfb64d73e8cdfdce1f2035df1d8ff86738ffc490e079c3e32bbc193bb5d: Status 404 returned error can't find the container with id 363a1bfb64d73e8cdfdce1f2035df1d8ff86738ffc490e079c3e32bbc193bb5d Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.071452 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.071836 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.127902 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.139306 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.424781 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.425106 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.536902 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerStarted","Data":"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"} Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.536954 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerStarted","Data":"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"} Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.539361 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-dm796" event={"ID":"7b15ecc4-edbe-4833-9118-7c6a4c7b3352","Type":"ContainerStarted","Data":"363a1bfb64d73e8cdfdce1f2035df1d8ff86738ffc490e079c3e32bbc193bb5d"} Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.541108 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.541140 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.648428 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 
16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.648519 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:24:04 crc kubenswrapper[4675]: I0320 16:24:04.719631 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:24:05 crc kubenswrapper[4675]: E0320 16:24:05.508371 4675 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b15ecc4_edbe_4833_9118_7c6a4c7b3352.slice/crio-1bb17a3da7b17bfdae61a39990418f2e8995872083cb61e6b53e6302d30d2242.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b15ecc4_edbe_4833_9118_7c6a4c7b3352.slice/crio-conmon-1bb17a3da7b17bfdae61a39990418f2e8995872083cb61e6b53e6302d30d2242.scope\": RecentStats: unable to find data in memory cache]" Mar 20 16:24:05 crc kubenswrapper[4675]: I0320 16:24:05.552011 4675 generic.go:334] "Generic (PLEG): container finished" podID="7b15ecc4-edbe-4833-9118-7c6a4c7b3352" containerID="1bb17a3da7b17bfdae61a39990418f2e8995872083cb61e6b53e6302d30d2242" exitCode=0 Mar 20 16:24:05 crc kubenswrapper[4675]: I0320 16:24:05.552150 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-dm796" event={"ID":"7b15ecc4-edbe-4833-9118-7c6a4c7b3352","Type":"ContainerDied","Data":"1bb17a3da7b17bfdae61a39990418f2e8995872083cb61e6b53e6302d30d2242"} Mar 20 16:24:06 crc kubenswrapper[4675]: I0320 16:24:06.565380 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerStarted","Data":"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"} Mar 20 16:24:06 crc kubenswrapper[4675]: I0320 16:24:06.565417 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" 
Mar 20 16:24:06 crc kubenswrapper[4675]: I0320 16:24:06.565718 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:24:06 crc kubenswrapper[4675]: I0320 16:24:06.566215 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:24:06 crc kubenswrapper[4675]: I0320 16:24:06.603395 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.993053135 podStartE2EDuration="9.603374961s" podCreationTimestamp="2026-03-20 16:23:57 +0000 UTC" firstStartedPulling="2026-03-20 16:23:58.326949993 +0000 UTC m=+1358.360579540" lastFinishedPulling="2026-03-20 16:24:05.937271829 +0000 UTC m=+1365.970901366" observedRunningTime="2026-03-20 16:24:06.596878397 +0000 UTC m=+1366.630507954" watchObservedRunningTime="2026-03-20 16:24:06.603374961 +0000 UTC m=+1366.637004498" Mar 20 16:24:06 crc kubenswrapper[4675]: I0320 16:24:06.744980 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.003175 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.006432 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-dm796" Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.138613 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbz9d\" (UniqueName: \"kubernetes.io/projected/7b15ecc4-edbe-4833-9118-7c6a4c7b3352-kube-api-access-cbz9d\") pod \"7b15ecc4-edbe-4833-9118-7c6a4c7b3352\" (UID: \"7b15ecc4-edbe-4833-9118-7c6a4c7b3352\") " Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.144389 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b15ecc4-edbe-4833-9118-7c6a4c7b3352-kube-api-access-cbz9d" (OuterVolumeSpecName: "kube-api-access-cbz9d") pod "7b15ecc4-edbe-4833-9118-7c6a4c7b3352" (UID: "7b15ecc4-edbe-4833-9118-7c6a4c7b3352"). InnerVolumeSpecName "kube-api-access-cbz9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.150407 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.240741 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbz9d\" (UniqueName: \"kubernetes.io/projected/7b15ecc4-edbe-4833-9118-7c6a4c7b3352-kube-api-access-cbz9d\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.577490 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-dm796" event={"ID":"7b15ecc4-edbe-4833-9118-7c6a4c7b3352","Type":"ContainerDied","Data":"363a1bfb64d73e8cdfdce1f2035df1d8ff86738ffc490e079c3e32bbc193bb5d"} Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.577539 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="363a1bfb64d73e8cdfdce1f2035df1d8ff86738ffc490e079c3e32bbc193bb5d" Mar 20 16:24:07 crc kubenswrapper[4675]: I0320 16:24:07.578608 4675 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-dm796"
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.104159 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-nfwrp"]
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.114258 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-nfwrp"]
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.586349 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-central-agent" containerID="cri-o://edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201" gracePeriod=30
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.586392 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="sg-core" containerID="cri-o://35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b" gracePeriod=30
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.586392 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="proxy-httpd" containerID="cri-o://63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7" gracePeriod=30
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.586478 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-notification-agent" containerID="cri-o://44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02" gracePeriod=30
Mar 20 16:24:08 crc kubenswrapper[4675]: I0320 16:24:08.684533 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b738426f-49bb-4a73-b55f-c4840a67a7d5" path="/var/lib/kubelet/pods/b738426f-49bb-4a73-b55f-c4840a67a7d5/volumes"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.339244 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.378821 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-log-httpd\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.378891 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5vth\" (UniqueName: \"kubernetes.io/projected/a6ab0c9f-e49c-4456-8515-9b4363d38531-kube-api-access-t5vth\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.378951 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-sg-core-conf-yaml\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.378998 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-scripts\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.379031 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-combined-ca-bundle\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.379087 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-run-httpd\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.379176 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-config-data\") pod \"a6ab0c9f-e49c-4456-8515-9b4363d38531\" (UID: \"a6ab0c9f-e49c-4456-8515-9b4363d38531\") "
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.380160 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.380463 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.389900 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ab0c9f-e49c-4456-8515-9b4363d38531-kube-api-access-t5vth" (OuterVolumeSpecName: "kube-api-access-t5vth") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "kube-api-access-t5vth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.390000 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-scripts" (OuterVolumeSpecName: "scripts") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.410280 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.458946 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.481195 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.481239 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5vth\" (UniqueName: \"kubernetes.io/projected/a6ab0c9f-e49c-4456-8515-9b4363d38531-kube-api-access-t5vth\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.481255 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.481267 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.481280 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.481294 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6ab0c9f-e49c-4456-8515-9b4363d38531-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.486142 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-config-data" (OuterVolumeSpecName: "config-data") pod "a6ab0c9f-e49c-4456-8515-9b4363d38531" (UID: "a6ab0c9f-e49c-4456-8515-9b4363d38531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.584852 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ab0c9f-e49c-4456-8515-9b4363d38531-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601173 4675 generic.go:334] "Generic (PLEG): container finished" podID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7" exitCode=0
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601207 4675 generic.go:334] "Generic (PLEG): container finished" podID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b" exitCode=2
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601216 4675 generic.go:334] "Generic (PLEG): container finished" podID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02" exitCode=0
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601227 4675 generic.go:334] "Generic (PLEG): container finished" podID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201" exitCode=0
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601246 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerDied","Data":"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"}
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601247 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601286 4675 scope.go:117] "RemoveContainer" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601272 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerDied","Data":"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"}
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601466 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerDied","Data":"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"}
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601481 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerDied","Data":"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"}
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.601494 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6ab0c9f-e49c-4456-8515-9b4363d38531","Type":"ContainerDied","Data":"8d3b7807e94420a08ab36dee5fd238f6c781011bac24378d78f488f3c78d1b99"}
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.655018 4675 scope.go:117] "RemoveContainer" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.664452 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.680185 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.687477 4675 scope.go:117] "RemoveContainer" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.694953 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.695349 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-notification-agent"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695367 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-notification-agent"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.695383 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="sg-core"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695390 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="sg-core"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.695405 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-central-agent"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695412 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-central-agent"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.695446 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b15ecc4-edbe-4833-9118-7c6a4c7b3352" containerName="oc"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695452 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b15ecc4-edbe-4833-9118-7c6a4c7b3352" containerName="oc"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.695464 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="proxy-httpd"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695470 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="proxy-httpd"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695657 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-central-agent"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695680 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b15ecc4-edbe-4833-9118-7c6a4c7b3352" containerName="oc"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695695 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="sg-core"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695705 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="ceilometer-notification-agent"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.695719 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" containerName="proxy-httpd"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.697246 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.706546 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.712605 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.724142 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.782817 4675 scope.go:117] "RemoveContainer" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791000 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-scripts\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791208 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791364 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-log-httpd\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791432 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krm4\" (UniqueName: \"kubernetes.io/projected/79f846df-cb1b-4441-b985-95d4b76be0fa-kube-api-access-9krm4\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791521 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791579 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-run-httpd\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.791602 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-config-data\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.800924 4675 scope.go:117] "RemoveContainer" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.801309 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": container with ID starting with 63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7 not found: ID does not exist" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.801351 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"} err="failed to get container status \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": rpc error: code = NotFound desc = could not find container \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": container with ID starting with 63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.801372 4675 scope.go:117] "RemoveContainer" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.801663 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": container with ID starting with 35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b not found: ID does not exist" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.801703 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"} err="failed to get container status \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": rpc error: code = NotFound desc = could not find container \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": container with ID starting with 35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.801747 4675 scope.go:117] "RemoveContainer" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.802109 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": container with ID starting with 44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02 not found: ID does not exist" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.802150 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"} err="failed to get container status \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": rpc error: code = NotFound desc = could not find container \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": container with ID starting with 44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.802177 4675 scope.go:117] "RemoveContainer" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"
Mar 20 16:24:09 crc kubenswrapper[4675]: E0320 16:24:09.802479 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": container with ID starting with edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201 not found: ID does not exist" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.802514 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"} err="failed to get container status \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": rpc error: code = NotFound desc = could not find container \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": container with ID starting with edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.802591 4675 scope.go:117] "RemoveContainer" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.802894 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"} err="failed to get container status \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": rpc error: code = NotFound desc = could not find container \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": container with ID starting with 63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.802926 4675 scope.go:117] "RemoveContainer" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803095 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"} err="failed to get container status \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": rpc error: code = NotFound desc = could not find container \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": container with ID starting with 35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803122 4675 scope.go:117] "RemoveContainer" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803330 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"} err="failed to get container status \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": rpc error: code = NotFound desc = could not find container \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": container with ID starting with 44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803353 4675 scope.go:117] "RemoveContainer" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803561 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"} err="failed to get container status \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": rpc error: code = NotFound desc = could not find container \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": container with ID starting with edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803583 4675 scope.go:117] "RemoveContainer" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803849 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"} err="failed to get container status \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": rpc error: code = NotFound desc = could not find container \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": container with ID starting with 63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.803909 4675 scope.go:117] "RemoveContainer" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.804146 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"} err="failed to get container status \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": rpc error: code = NotFound desc = could not find container \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": container with ID starting with 35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.804171 4675 scope.go:117] "RemoveContainer" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.804425 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"} err="failed to get container status \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": rpc error: code = NotFound desc = could not find container \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": container with ID starting with 44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.804450 4675 scope.go:117] "RemoveContainer" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.804790 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"} err="failed to get container status \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": rpc error: code = NotFound desc = could not find container \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": container with ID starting with edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.804813 4675 scope.go:117] "RemoveContainer" containerID="63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805033 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7"} err="failed to get container status \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": rpc error: code = NotFound desc = could not find container \"63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7\": container with ID starting with 63db68f6d5d36f3af756a68251d5720df81ce363481791349b2b0c35e683bbf7 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805056 4675 scope.go:117] "RemoveContainer" containerID="35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805223 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b"} err="failed to get container status \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": rpc error: code = NotFound desc = could not find container \"35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b\": container with ID starting with 35fa280c029f6d9cccc7e9563f73e6723bec17ca6e283c71156b716e7b4fad9b not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805242 4675 scope.go:117] "RemoveContainer" containerID="44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805437 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02"} err="failed to get container status \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": rpc error: code = NotFound desc = could not find container \"44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02\": container with ID starting with 44225660ca0eb8261b3426839eb3e185d6645c03f24dab62409f24c96c17ff02 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805475 4675 scope.go:117] "RemoveContainer" containerID="edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.805863 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201"} err="failed to get container status \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": rpc error: code = NotFound desc = could not find container \"edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201\": container with ID starting with edc67482a7f4f7b1fcb0f5ab594b2d8ceb1a74f999d58d3876f7c26928a91201 not found: ID does not exist"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.892833 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-log-httpd\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.893096 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krm4\" (UniqueName: \"kubernetes.io/projected/79f846df-cb1b-4441-b985-95d4b76be0fa-kube-api-access-9krm4\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.893135 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.893162 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-config-data\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.893177 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-run-httpd\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.893204 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-scripts\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.893618 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-run-httpd\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.894085 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-log-httpd\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.894403 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.897556 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.897761 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-scripts\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.897839 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-config-data\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.898420 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:09 crc kubenswrapper[4675]: I0320 16:24:09.920003 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9krm4\" (UniqueName: \"kubernetes.io/projected/79f846df-cb1b-4441-b985-95d4b76be0fa-kube-api-access-9krm4\") pod \"ceilometer-0\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " pod="openstack/ceilometer-0"
Mar 20 16:24:10 crc kubenswrapper[4675]: I0320 16:24:10.069728 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:24:10 crc kubenswrapper[4675]: I0320 16:24:10.582068 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:24:10 crc kubenswrapper[4675]: I0320 16:24:10.611726 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerStarted","Data":"69ce684bb7e82c664c02261c096634e9a403802db47247d36ee474e3b3f4339b"}
Mar 20 16:24:10 crc kubenswrapper[4675]: I0320 16:24:10.688019 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ab0c9f-e49c-4456-8515-9b4363d38531" path="/var/lib/kubelet/pods/a6ab0c9f-e49c-4456-8515-9b4363d38531/volumes"
Mar 20 16:24:11 crc kubenswrapper[4675]: I0320 16:24:11.624926 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerStarted","Data":"e0c5591cddeb0434d1d8c09584451c7acdabc6ef73c7423de0b7479050701dcc"}
Mar 20 16:24:12 crc kubenswrapper[4675]: I0320 16:24:12.640327 4675 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerStarted","Data":"83a191ea65813943c763870c0cebf256c173bb5b3a0068521210e20a554684b4"} Mar 20 16:24:14 crc kubenswrapper[4675]: I0320 16:24:14.666184 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerStarted","Data":"ab3ebb1bc412726c5f5c88269795bfc729d4c428b63a02c6fd2cd70ef3156880"} Mar 20 16:24:16 crc kubenswrapper[4675]: I0320 16:24:16.699593 4675 generic.go:334] "Generic (PLEG): container finished" podID="e5eba599-99e1-4899-8ae7-0ba38e60724b" containerID="c939029382189ad85b94113049399625b490c4c552687aac5df5d72f3dbe945e" exitCode=0 Mar 20 16:24:16 crc kubenswrapper[4675]: I0320 16:24:16.701410 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" event={"ID":"e5eba599-99e1-4899-8ae7-0ba38e60724b","Type":"ContainerDied","Data":"c939029382189ad85b94113049399625b490c4c552687aac5df5d72f3dbe945e"} Mar 20 16:24:16 crc kubenswrapper[4675]: I0320 16:24:16.706652 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerStarted","Data":"921d192ff6602ea774199176cb4a46efefee38889072b25d0c2f9a7089c91822"} Mar 20 16:24:16 crc kubenswrapper[4675]: I0320 16:24:16.706897 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:24:16 crc kubenswrapper[4675]: I0320 16:24:16.755679 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.264683262 podStartE2EDuration="7.75564726s" podCreationTimestamp="2026-03-20 16:24:09 +0000 UTC" firstStartedPulling="2026-03-20 16:24:10.593952799 +0000 UTC m=+1370.627582356" lastFinishedPulling="2026-03-20 16:24:16.084916817 +0000 UTC m=+1376.118546354" 
observedRunningTime="2026-03-20 16:24:16.747384476 +0000 UTC m=+1376.781014043" watchObservedRunningTime="2026-03-20 16:24:16.75564726 +0000 UTC m=+1376.789276837" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.054998 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.179041 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-scripts\") pod \"e5eba599-99e1-4899-8ae7-0ba38e60724b\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.179128 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-combined-ca-bundle\") pod \"e5eba599-99e1-4899-8ae7-0ba38e60724b\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.179190 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-config-data\") pod \"e5eba599-99e1-4899-8ae7-0ba38e60724b\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.179277 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6gl4\" (UniqueName: \"kubernetes.io/projected/e5eba599-99e1-4899-8ae7-0ba38e60724b-kube-api-access-w6gl4\") pod \"e5eba599-99e1-4899-8ae7-0ba38e60724b\" (UID: \"e5eba599-99e1-4899-8ae7-0ba38e60724b\") " Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.184464 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-scripts" (OuterVolumeSpecName: 
"scripts") pod "e5eba599-99e1-4899-8ae7-0ba38e60724b" (UID: "e5eba599-99e1-4899-8ae7-0ba38e60724b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.184651 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5eba599-99e1-4899-8ae7-0ba38e60724b-kube-api-access-w6gl4" (OuterVolumeSpecName: "kube-api-access-w6gl4") pod "e5eba599-99e1-4899-8ae7-0ba38e60724b" (UID: "e5eba599-99e1-4899-8ae7-0ba38e60724b"). InnerVolumeSpecName "kube-api-access-w6gl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.205090 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5eba599-99e1-4899-8ae7-0ba38e60724b" (UID: "e5eba599-99e1-4899-8ae7-0ba38e60724b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.205966 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-config-data" (OuterVolumeSpecName: "config-data") pod "e5eba599-99e1-4899-8ae7-0ba38e60724b" (UID: "e5eba599-99e1-4899-8ae7-0ba38e60724b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.282518 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.282556 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.282565 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eba599-99e1-4899-8ae7-0ba38e60724b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.282574 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6gl4\" (UniqueName: \"kubernetes.io/projected/e5eba599-99e1-4899-8ae7-0ba38e60724b-kube-api-access-w6gl4\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.755879 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" event={"ID":"e5eba599-99e1-4899-8ae7-0ba38e60724b","Type":"ContainerDied","Data":"acfea37de0b66495ee6daf2edf8a864d9f10007efbd5e7181fdc2bcda5604d43"} Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.755920 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cvcgw" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.755925 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acfea37de0b66495ee6daf2edf8a864d9f10007efbd5e7181fdc2bcda5604d43" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.826500 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 16:24:18 crc kubenswrapper[4675]: E0320 16:24:18.826895 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5eba599-99e1-4899-8ae7-0ba38e60724b" containerName="nova-cell0-conductor-db-sync" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.826916 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eba599-99e1-4899-8ae7-0ba38e60724b" containerName="nova-cell0-conductor-db-sync" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.827131 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5eba599-99e1-4899-8ae7-0ba38e60724b" containerName="nova-cell0-conductor-db-sync" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.827823 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.830942 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gfrf2" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.831750 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.842224 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.996483 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.996829 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:18 crc kubenswrapper[4675]: I0320 16:24:18.996855 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f447d\" (UniqueName: \"kubernetes.io/projected/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-kube-api-access-f447d\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.098923 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.099020 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f447d\" (UniqueName: \"kubernetes.io/projected/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-kube-api-access-f447d\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.099051 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.120668 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.120684 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.138470 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f447d\" (UniqueName: \"kubernetes.io/projected/965c51c2-a3e5-46f3-8a76-3cdb812c96c6-kube-api-access-f447d\") pod \"nova-cell0-conductor-0\" 
(UID: \"965c51c2-a3e5-46f3-8a76-3cdb812c96c6\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.153198 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.577658 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 16:24:19 crc kubenswrapper[4675]: I0320 16:24:19.768251 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"965c51c2-a3e5-46f3-8a76-3cdb812c96c6","Type":"ContainerStarted","Data":"5c5f848bc5eb52bba6ea573f48fbf561a7cb3eeabebeb8ec234a8090ebf3794d"} Mar 20 16:24:20 crc kubenswrapper[4675]: I0320 16:24:20.777632 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"965c51c2-a3e5-46f3-8a76-3cdb812c96c6","Type":"ContainerStarted","Data":"fe933cc77dcb13e1d82be308ee9e48b3e930cc10959d899806447ed494a65c54"} Mar 20 16:24:20 crc kubenswrapper[4675]: I0320 16:24:20.777723 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:20 crc kubenswrapper[4675]: I0320 16:24:20.802051 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.802029736 podStartE2EDuration="2.802029736s" podCreationTimestamp="2026-03-20 16:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:20.791112578 +0000 UTC m=+1380.824742175" watchObservedRunningTime="2026-03-20 16:24:20.802029736 +0000 UTC m=+1380.835659303" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.181576 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 16:24:24 crc kubenswrapper[4675]: 
I0320 16:24:24.605927 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gjbkq"] Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.607761 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.612946 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.613244 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.615549 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gjbkq"] Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.707511 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-config-data\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.707620 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxdxm\" (UniqueName: \"kubernetes.io/projected/3f80fe29-bb20-44a9-a687-24bc81243833-kube-api-access-nxdxm\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.707672 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-scripts\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " 
pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.707748 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.784150 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.787072 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.795669 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.808912 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.809854 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-config-data\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.809941 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxdxm\" (UniqueName: \"kubernetes.io/projected/3f80fe29-bb20-44a9-a687-24bc81243833-kube-api-access-nxdxm\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.809972 4675 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-scripts\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.810026 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.819083 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-config-data\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.819606 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.857566 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-scripts\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.897568 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxdxm\" (UniqueName: 
\"kubernetes.io/projected/3f80fe29-bb20-44a9-a687-24bc81243833-kube-api-access-nxdxm\") pod \"nova-cell0-cell-mapping-gjbkq\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") " pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.917090 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.917269 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bllk\" (UniqueName: \"kubernetes.io/projected/b643d9f5-19ce-4094-8593-7364062a69de-kube-api-access-8bllk\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.917304 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-config-data\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.917525 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b643d9f5-19ce-4094-8593-7364062a69de-logs\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.941816 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:24 crc kubenswrapper[4675]: I0320 16:24:24.979252 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.008398 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.014739 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.019838 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bllk\" (UniqueName: \"kubernetes.io/projected/b643d9f5-19ce-4094-8593-7364062a69de-kube-api-access-8bllk\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.019895 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-config-data\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.019958 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.019976 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgk8s\" (UniqueName: \"kubernetes.io/projected/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-kube-api-access-qgk8s\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.020023 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b643d9f5-19ce-4094-8593-7364062a69de-logs\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.020062 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.020090 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.021108 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b643d9f5-19ce-4094-8593-7364062a69de-logs\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.026646 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-config-data\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.033999 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.066620 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.077471 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.077668 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bllk\" (UniqueName: \"kubernetes.io/projected/b643d9f5-19ce-4094-8593-7364062a69de-kube-api-access-8bllk\") pod \"nova-api-0\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.081909 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.086268 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.109112 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.111079 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.114212 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121177 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121226 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-config-data\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121265 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mv4q\" (UniqueName: \"kubernetes.io/projected/2ce2d8c1-645a-4ee0-9023-0029c68e0732-kube-api-access-2mv4q\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121320 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-config-data\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121335 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121357 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121376 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgk8s\" (UniqueName: \"kubernetes.io/projected/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-kube-api-access-qgk8s\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121408 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnzk\" (UniqueName: \"kubernetes.io/projected/6a20f49a-f98a-4a00-839d-5c57cf7551a2-kube-api-access-rtnzk\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121434 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.121459 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce2d8c1-645a-4ee0-9023-0029c68e0732-logs\") pod 
\"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.126210 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.127803 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.134560 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.143139 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.144968 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgk8s\" (UniqueName: \"kubernetes.io/projected/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-kube-api-access-qgk8s\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.166860 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4ntqw"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.168538 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.218515 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4ntqw"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222541 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222598 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mv4q\" (UniqueName: \"kubernetes.io/projected/2ce2d8c1-645a-4ee0-9023-0029c68e0732-kube-api-access-2mv4q\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222714 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222825 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-config-data\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222858 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222923 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-config\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222956 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnzk\" (UniqueName: \"kubernetes.io/projected/6a20f49a-f98a-4a00-839d-5c57cf7551a2-kube-api-access-rtnzk\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.222997 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.223032 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.223052 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce2d8c1-645a-4ee0-9023-0029c68e0732-logs\") pod \"nova-metadata-0\" (UID: 
\"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.223092 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x956k\" (UniqueName: \"kubernetes.io/projected/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-kube-api-access-x956k\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.223129 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.223165 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-config-data\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.224814 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce2d8c1-645a-4ee0-9023-0029c68e0732-logs\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.242046 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: 
I0320 16:24:25.242484 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.244343 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnzk\" (UniqueName: \"kubernetes.io/projected/6a20f49a-f98a-4a00-839d-5c57cf7551a2-kube-api-access-rtnzk\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.244878 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mv4q\" (UniqueName: \"kubernetes.io/projected/2ce2d8c1-645a-4ee0-9023-0029c68e0732-kube-api-access-2mv4q\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.256151 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-config-data\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.256388 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") " pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.258382 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-config-data\") pod \"nova-scheduler-0\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.325529 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.325613 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.325674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-config\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.325718 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.325746 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x956k\" (UniqueName: \"kubernetes.io/projected/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-kube-api-access-x956k\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.325801 4675 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.326849 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-config\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.326880 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.328068 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.328114 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.328432 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.360023 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x956k\" (UniqueName: \"kubernetes.io/projected/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-kube-api-access-x956k\") pod \"dnsmasq-dns-845d6d6f59-4ntqw\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.413452 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.435467 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.450569 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.508355 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.560103 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.602984 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gjbkq"] Mar 20 16:24:25 crc kubenswrapper[4675]: W0320 16:24:25.632100 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb643d9f5_19ce_4094_8593_7364062a69de.slice/crio-8391fed66262da15dbb2f1f24745d73d9621b3c76b5df7fcf24dd33c67ccab76 WatchSource:0}: Error finding container 8391fed66262da15dbb2f1f24745d73d9621b3c76b5df7fcf24dd33c67ccab76: Status 404 returned error can't find the container with id 8391fed66262da15dbb2f1f24745d73d9621b3c76b5df7fcf24dd33c67ccab76 Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.675967 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zbgx5"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.678117 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.680026 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.684115 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.715115 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zbgx5"] Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.744352 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5x9w\" (UniqueName: \"kubernetes.io/projected/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-kube-api-access-v5x9w\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.744616 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-config-data\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.744649 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-scripts\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.744909 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.846091 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5x9w\" (UniqueName: \"kubernetes.io/projected/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-kube-api-access-v5x9w\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.846139 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-config-data\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.846164 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-scripts\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.846244 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.851413 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.851602 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-config-data\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.859655 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-scripts\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.871025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5x9w\" (UniqueName: \"kubernetes.io/projected/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-kube-api-access-v5x9w\") pod \"nova-cell1-conductor-db-sync-zbgx5\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") " pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.945697 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gjbkq" event={"ID":"3f80fe29-bb20-44a9-a687-24bc81243833","Type":"ContainerStarted","Data":"f3bf4a83559c2e6f4bc3f6b30050c34dca0f8f21efe574849db86a45a8664d7b"} Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.946197 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gjbkq" 
event={"ID":"3f80fe29-bb20-44a9-a687-24bc81243833","Type":"ContainerStarted","Data":"6fd0dea373d907c346461fddafa551ecfaab9d6de56dc7daa96b038acf8a30c1"} Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.948333 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b643d9f5-19ce-4094-8593-7364062a69de","Type":"ContainerStarted","Data":"8391fed66262da15dbb2f1f24745d73d9621b3c76b5df7fcf24dd33c67ccab76"} Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.961283 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gjbkq" podStartSLOduration=1.961265531 podStartE2EDuration="1.961265531s" podCreationTimestamp="2026-03-20 16:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:25.959906222 +0000 UTC m=+1385.993535779" watchObservedRunningTime="2026-03-20 16:24:25.961265531 +0000 UTC m=+1385.994895068" Mar 20 16:24:25 crc kubenswrapper[4675]: I0320 16:24:25.999407 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.054810 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.063828 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.265622 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4ntqw"] Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.281374 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:26 crc kubenswrapper[4675]: W0320 16:24:26.282867 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ce2d8c1_645a_4ee0_9023_0029c68e0732.slice/crio-443ac6fbd8588837c53d9095dfd1e8238de42ddd0077609870e48caa216bc4e7 WatchSource:0}: Error finding container 443ac6fbd8588837c53d9095dfd1e8238de42ddd0077609870e48caa216bc4e7: Status 404 returned error can't find the container with id 443ac6fbd8588837c53d9095dfd1e8238de42ddd0077609870e48caa216bc4e7 Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.547861 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zbgx5"] Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.974028 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ce2d8c1-645a-4ee0-9023-0029c68e0732","Type":"ContainerStarted","Data":"443ac6fbd8588837c53d9095dfd1e8238de42ddd0077609870e48caa216bc4e7"} Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.977190 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" 
event={"ID":"b9ddb96c-6950-4300-bf0d-cf65d46c12fb","Type":"ContainerStarted","Data":"a43de24ddb10ac9b6b35757dcf8b118bcc5ee17c54a179acc3a9a5fad42ae441"} Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.977249 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" event={"ID":"b9ddb96c-6950-4300-bf0d-cf65d46c12fb","Type":"ContainerStarted","Data":"d373a62e63e9e950263a399f4cc7e953f1dcfef1d2619bfd566fae0d288ec5c5"} Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.981055 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94f8da9-3074-4101-a3ed-f1bddacf8ce5","Type":"ContainerStarted","Data":"9b4c2d450b42c011af14d71117cfcae21215158a247df3d18f19bb20b84dcf51"} Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.984867 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a20f49a-f98a-4a00-839d-5c57cf7551a2","Type":"ContainerStarted","Data":"19797b313a76a3fbaa5e9938628bcede59bc123b35064cb219cdbe1e2d2b1499"} Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.994568 4675 generic.go:334] "Generic (PLEG): container finished" podID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerID="19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637" exitCode=0 Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.994634 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" event={"ID":"bb0ed576-df0b-45b0-ac22-4848f08bcbfb","Type":"ContainerDied","Data":"19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637"} Mar 20 16:24:26 crc kubenswrapper[4675]: I0320 16:24:26.994941 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" event={"ID":"bb0ed576-df0b-45b0-ac22-4848f08bcbfb","Type":"ContainerStarted","Data":"48f7e89e1e3053a0954d0cd54cca116e177bc34a14ce0e15f9b1d90736ddc964"} Mar 20 16:24:26 crc 
kubenswrapper[4675]: I0320 16:24:26.996481 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" podStartSLOduration=1.996433202 podStartE2EDuration="1.996433202s" podCreationTimestamp="2026-03-20 16:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:26.992583993 +0000 UTC m=+1387.026213530" watchObservedRunningTime="2026-03-20 16:24:26.996433202 +0000 UTC m=+1387.030062739"
Mar 20 16:24:28 crc kubenswrapper[4675]: I0320 16:24:28.013177 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" event={"ID":"bb0ed576-df0b-45b0-ac22-4848f08bcbfb","Type":"ContainerStarted","Data":"14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9"}
Mar 20 16:24:28 crc kubenswrapper[4675]: I0320 16:24:28.013537 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw"
Mar 20 16:24:28 crc kubenswrapper[4675]: I0320 16:24:28.036316 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" podStartSLOduration=3.036293696 podStartE2EDuration="3.036293696s" podCreationTimestamp="2026-03-20 16:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:28.031456979 +0000 UTC m=+1388.065086526" watchObservedRunningTime="2026-03-20 16:24:28.036293696 +0000 UTC m=+1388.069923233"
Mar 20 16:24:28 crc kubenswrapper[4675]: I0320 16:24:28.512178 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:24:28 crc kubenswrapper[4675]: I0320 16:24:28.525086 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.035074 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b643d9f5-19ce-4094-8593-7364062a69de","Type":"ContainerStarted","Data":"44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf"}
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.035398 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b643d9f5-19ce-4094-8593-7364062a69de","Type":"ContainerStarted","Data":"488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3"}
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.038186 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a20f49a-f98a-4a00-839d-5c57cf7551a2","Type":"ContainerStarted","Data":"af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645"}
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.041002 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ce2d8c1-645a-4ee0-9023-0029c68e0732","Type":"ContainerStarted","Data":"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"}
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.041064 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ce2d8c1-645a-4ee0-9023-0029c68e0732","Type":"ContainerStarted","Data":"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"}
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.041236 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-log" containerID="cri-o://da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c" gracePeriod=30
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.041621 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-metadata" containerID="cri-o://2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5" gracePeriod=30
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.048040 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94f8da9-3074-4101-a3ed-f1bddacf8ce5","Type":"ContainerStarted","Data":"ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d"}
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.048486 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a94f8da9-3074-4101-a3ed-f1bddacf8ce5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d" gracePeriod=30
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.057974 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.80197222 podStartE2EDuration="6.057957931s" podCreationTimestamp="2026-03-20 16:24:24 +0000 UTC" firstStartedPulling="2026-03-20 16:24:25.630560216 +0000 UTC m=+1385.664189753" lastFinishedPulling="2026-03-20 16:24:28.886545927 +0000 UTC m=+1388.920175464" observedRunningTime="2026-03-20 16:24:30.056909611 +0000 UTC m=+1390.090539198" watchObservedRunningTime="2026-03-20 16:24:30.057957931 +0000 UTC m=+1390.091587468"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.085903 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.475742288 podStartE2EDuration="6.08588383s" podCreationTimestamp="2026-03-20 16:24:24 +0000 UTC" firstStartedPulling="2026-03-20 16:24:26.286337686 +0000 UTC m=+1386.319967233" lastFinishedPulling="2026-03-20 16:24:28.896479228 +0000 UTC m=+1388.930108775" observedRunningTime="2026-03-20 16:24:30.080073716 +0000 UTC m=+1390.113703253" watchObservedRunningTime="2026-03-20 16:24:30.08588383 +0000 UTC m=+1390.119513377"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.101761 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.318802649 podStartE2EDuration="6.101737049s" podCreationTimestamp="2026-03-20 16:24:24 +0000 UTC" firstStartedPulling="2026-03-20 16:24:26.103610657 +0000 UTC m=+1386.137240194" lastFinishedPulling="2026-03-20 16:24:28.886545057 +0000 UTC m=+1388.920174594" observedRunningTime="2026-03-20 16:24:30.095543534 +0000 UTC m=+1390.129173071" watchObservedRunningTime="2026-03-20 16:24:30.101737049 +0000 UTC m=+1390.135366596"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.114723 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.293496375 podStartE2EDuration="6.114701856s" podCreationTimestamp="2026-03-20 16:24:24 +0000 UTC" firstStartedPulling="2026-03-20 16:24:26.065549871 +0000 UTC m=+1386.099179408" lastFinishedPulling="2026-03-20 16:24:28.886755352 +0000 UTC m=+1388.920384889" observedRunningTime="2026-03-20 16:24:30.112528554 +0000 UTC m=+1390.146158091" watchObservedRunningTime="2026-03-20 16:24:30.114701856 +0000 UTC m=+1390.148331393"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.413748 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.436530 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.629018 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.695808 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-config-data\") pod \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") "
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.696015 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce2d8c1-645a-4ee0-9023-0029c68e0732-logs\") pod \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") "
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.696039 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-combined-ca-bundle\") pod \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") "
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.696366 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mv4q\" (UniqueName: \"kubernetes.io/projected/2ce2d8c1-645a-4ee0-9023-0029c68e0732-kube-api-access-2mv4q\") pod \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\" (UID: \"2ce2d8c1-645a-4ee0-9023-0029c68e0732\") "
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.696396 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce2d8c1-645a-4ee0-9023-0029c68e0732-logs" (OuterVolumeSpecName: "logs") pod "2ce2d8c1-645a-4ee0-9023-0029c68e0732" (UID: "2ce2d8c1-645a-4ee0-9023-0029c68e0732"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.696917 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce2d8c1-645a-4ee0-9023-0029c68e0732-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.702412 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce2d8c1-645a-4ee0-9023-0029c68e0732-kube-api-access-2mv4q" (OuterVolumeSpecName: "kube-api-access-2mv4q") pod "2ce2d8c1-645a-4ee0-9023-0029c68e0732" (UID: "2ce2d8c1-645a-4ee0-9023-0029c68e0732"). InnerVolumeSpecName "kube-api-access-2mv4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.720745 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ce2d8c1-645a-4ee0-9023-0029c68e0732" (UID: "2ce2d8c1-645a-4ee0-9023-0029c68e0732"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.722375 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-config-data" (OuterVolumeSpecName: "config-data") pod "2ce2d8c1-645a-4ee0-9023-0029c68e0732" (UID: "2ce2d8c1-645a-4ee0-9023-0029c68e0732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.799600 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.799635 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mv4q\" (UniqueName: \"kubernetes.io/projected/2ce2d8c1-645a-4ee0-9023-0029c68e0732-kube-api-access-2mv4q\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:30 crc kubenswrapper[4675]: I0320 16:24:30.799651 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce2d8c1-645a-4ee0-9023-0029c68e0732-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.063912 4675 generic.go:334] "Generic (PLEG): container finished" podID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerID="2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5" exitCode=0
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.065081 4675 generic.go:334] "Generic (PLEG): container finished" podID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerID="da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c" exitCode=143
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.064088 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ce2d8c1-645a-4ee0-9023-0029c68e0732","Type":"ContainerDied","Data":"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"}
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.066209 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ce2d8c1-645a-4ee0-9023-0029c68e0732","Type":"ContainerDied","Data":"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"}
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.066247 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ce2d8c1-645a-4ee0-9023-0029c68e0732","Type":"ContainerDied","Data":"443ac6fbd8588837c53d9095dfd1e8238de42ddd0077609870e48caa216bc4e7"}
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.066268 4675 scope.go:117] "RemoveContainer" containerID="2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.064062 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.103350 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.108820 4675 scope.go:117] "RemoveContainer" containerID="da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.111478 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.130524 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:24:31 crc kubenswrapper[4675]: E0320 16:24:31.130975 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-metadata"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.130988 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-metadata"
Mar 20 16:24:31 crc kubenswrapper[4675]: E0320 16:24:31.131007 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-log"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.131013 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-log"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.131176 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-metadata"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.131186 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" containerName="nova-metadata-log"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.132187 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.134431 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.135457 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.147060 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.149555 4675 scope.go:117] "RemoveContainer" containerID="2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"
Mar 20 16:24:31 crc kubenswrapper[4675]: E0320 16:24:31.150546 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5\": container with ID starting with 2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5 not found: ID does not exist" containerID="2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.150591 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"} err="failed to get container status \"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5\": rpc error: code = NotFound desc = could not find container \"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5\": container with ID starting with 2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5 not found: ID does not exist"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.150629 4675 scope.go:117] "RemoveContainer" containerID="da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"
Mar 20 16:24:31 crc kubenswrapper[4675]: E0320 16:24:31.154023 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c\": container with ID starting with da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c not found: ID does not exist" containerID="da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.154180 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"} err="failed to get container status \"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c\": rpc error: code = NotFound desc = could not find container \"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c\": container with ID starting with da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c not found: ID does not exist"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.154282 4675 scope.go:117] "RemoveContainer" containerID="2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.155757 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5"} err="failed to get container status \"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5\": rpc error: code = NotFound desc = could not find container \"2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5\": container with ID starting with 2d29f245cbbb149d181f7800ca20b774b326951482dd4be8f69d9bf2488a3ab5 not found: ID does not exist"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.155813 4675 scope.go:117] "RemoveContainer" containerID="da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.156188 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c"} err="failed to get container status \"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c\": rpc error: code = NotFound desc = could not find container \"da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c\": container with ID starting with da615e6c935d62e7dd149ef9b2cc35bd3b42e326eb898caf0e003ea159fde54c not found: ID does not exist"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.207404 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-logs\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.208620 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.209137 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-config-data\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.209246 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cp56\" (UniqueName: \"kubernetes.io/projected/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-kube-api-access-7cp56\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.209344 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.311066 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.311243 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-logs\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.311827 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-logs\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.312000 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.312495 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-config-data\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.312573 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cp56\" (UniqueName: \"kubernetes.io/projected/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-kube-api-access-7cp56\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.315788 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-config-data\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.316318 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.339790 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.342294 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cp56\" (UniqueName: \"kubernetes.io/projected/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-kube-api-access-7cp56\") pod \"nova-metadata-0\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.454338 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:24:31 crc kubenswrapper[4675]: I0320 16:24:31.907444 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:24:32 crc kubenswrapper[4675]: I0320 16:24:32.081650 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf","Type":"ContainerStarted","Data":"3c25f612b37c58e2ff5ef56670fffcf78813e736b29631ca940a85cf733f2bd2"}
Mar 20 16:24:32 crc kubenswrapper[4675]: I0320 16:24:32.698474 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce2d8c1-645a-4ee0-9023-0029c68e0732" path="/var/lib/kubelet/pods/2ce2d8c1-645a-4ee0-9023-0029c68e0732/volumes"
Mar 20 16:24:33 crc kubenswrapper[4675]: I0320 16:24:33.093578 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf","Type":"ContainerStarted","Data":"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c"}
Mar 20 16:24:33 crc kubenswrapper[4675]: I0320 16:24:33.095187 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf","Type":"ContainerStarted","Data":"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82"}
Mar 20 16:24:33 crc kubenswrapper[4675]: I0320 16:24:33.113497 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.113480729 podStartE2EDuration="2.113480729s" podCreationTimestamp="2026-03-20 16:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:33.109582949 +0000 UTC m=+1393.143212486" watchObservedRunningTime="2026-03-20 16:24:33.113480729 +0000 UTC m=+1393.147110266"
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.105473 4675 generic.go:334] "Generic (PLEG): container finished" podID="3f80fe29-bb20-44a9-a687-24bc81243833" containerID="f3bf4a83559c2e6f4bc3f6b30050c34dca0f8f21efe574849db86a45a8664d7b" exitCode=0
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.105633 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gjbkq" event={"ID":"3f80fe29-bb20-44a9-a687-24bc81243833","Type":"ContainerDied","Data":"f3bf4a83559c2e6f4bc3f6b30050c34dca0f8f21efe574849db86a45a8664d7b"}
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.108371 4675 generic.go:334] "Generic (PLEG): container finished" podID="b9ddb96c-6950-4300-bf0d-cf65d46c12fb" containerID="a43de24ddb10ac9b6b35757dcf8b118bcc5ee17c54a179acc3a9a5fad42ae441" exitCode=0
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.108470 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" event={"ID":"b9ddb96c-6950-4300-bf0d-cf65d46c12fb","Type":"ContainerDied","Data":"a43de24ddb10ac9b6b35757dcf8b118bcc5ee17c54a179acc3a9a5fad42ae441"}
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.425015 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.425091 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.425541 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5"
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.426519 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7732914d5ec5c37cec22ffa5532f80bae40c4bcbf0ea409824aff0266bbf1edb"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:24:34 crc kubenswrapper[4675]: I0320 16:24:34.426578 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://7732914d5ec5c37cec22ffa5532f80bae40c4bcbf0ea409824aff0266bbf1edb" gracePeriod=600
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.125605 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="7732914d5ec5c37cec22ffa5532f80bae40c4bcbf0ea409824aff0266bbf1edb" exitCode=0
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.125691 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"7732914d5ec5c37cec22ffa5532f80bae40c4bcbf0ea409824aff0266bbf1edb"}
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.126282 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"}
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.126676 4675 scope.go:117] "RemoveContainer" containerID="82c9eee9831702a396f1ab945d5ca7dca1e7cbf4d14cc472240a2d6bc5bec93c"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.243686 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.243731 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.436569 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.475423 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.510746 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.589726 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2dwd4"]
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.594147 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerName="dnsmasq-dns" containerID="cri-o://0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b" gracePeriod=10
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.678682 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zbgx5"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.687656 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gjbkq"
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817015 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-combined-ca-bundle\") pod \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817085 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-config-data\") pod \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817149 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-scripts\") pod \"3f80fe29-bb20-44a9-a687-24bc81243833\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817179 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-config-data\") pod \"3f80fe29-bb20-44a9-a687-24bc81243833\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817220 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-combined-ca-bundle\") pod \"3f80fe29-bb20-44a9-a687-24bc81243833\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817366 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-scripts\") pod \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817421 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5x9w\" (UniqueName: \"kubernetes.io/projected/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-kube-api-access-v5x9w\") pod \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\" (UID: \"b9ddb96c-6950-4300-bf0d-cf65d46c12fb\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.817488 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxdxm\" (UniqueName: \"kubernetes.io/projected/3f80fe29-bb20-44a9-a687-24bc81243833-kube-api-access-nxdxm\") pod \"3f80fe29-bb20-44a9-a687-24bc81243833\" (UID: \"3f80fe29-bb20-44a9-a687-24bc81243833\") "
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.832956 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-scripts" (OuterVolumeSpecName: "scripts") pod "3f80fe29-bb20-44a9-a687-24bc81243833" (UID: "3f80fe29-bb20-44a9-a687-24bc81243833"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.833213 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f80fe29-bb20-44a9-a687-24bc81243833-kube-api-access-nxdxm" (OuterVolumeSpecName: "kube-api-access-nxdxm") pod "3f80fe29-bb20-44a9-a687-24bc81243833" (UID: "3f80fe29-bb20-44a9-a687-24bc81243833"). InnerVolumeSpecName "kube-api-access-nxdxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.833961 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-scripts" (OuterVolumeSpecName: "scripts") pod "b9ddb96c-6950-4300-bf0d-cf65d46c12fb" (UID: "b9ddb96c-6950-4300-bf0d-cf65d46c12fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.849558 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-config-data" (OuterVolumeSpecName: "config-data") pod "b9ddb96c-6950-4300-bf0d-cf65d46c12fb" (UID: "b9ddb96c-6950-4300-bf0d-cf65d46c12fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.853609 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-kube-api-access-v5x9w" (OuterVolumeSpecName: "kube-api-access-v5x9w") pod "b9ddb96c-6950-4300-bf0d-cf65d46c12fb" (UID: "b9ddb96c-6950-4300-bf0d-cf65d46c12fb"). InnerVolumeSpecName "kube-api-access-v5x9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.867225 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9ddb96c-6950-4300-bf0d-cf65d46c12fb" (UID: "b9ddb96c-6950-4300-bf0d-cf65d46c12fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.870792 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f80fe29-bb20-44a9-a687-24bc81243833" (UID: "3f80fe29-bb20-44a9-a687-24bc81243833"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.873625 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-config-data" (OuterVolumeSpecName: "config-data") pod "3f80fe29-bb20-44a9-a687-24bc81243833" (UID: "3f80fe29-bb20-44a9-a687-24bc81243833"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920432 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920470 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5x9w\" (UniqueName: \"kubernetes.io/projected/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-kube-api-access-v5x9w\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920488 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxdxm\" (UniqueName: \"kubernetes.io/projected/3f80fe29-bb20-44a9-a687-24bc81243833-kube-api-access-nxdxm\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920501 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920513 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ddb96c-6950-4300-bf0d-cf65d46c12fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920526 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920537 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4675]: I0320 16:24:35.920547 4675 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f80fe29-bb20-44a9-a687-24bc81243833-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.061724 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.122633 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-sb\") pod \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.122926 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-config\") pod \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.122991 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-svc\") pod \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.123066 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qv9m\" (UniqueName: \"kubernetes.io/projected/9da42c2c-0330-4ba7-9b59-46d4d84fe045-kube-api-access-6qv9m\") pod \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.123113 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-nb\") pod \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.123131 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-swift-storage-0\") pod \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\" (UID: \"9da42c2c-0330-4ba7-9b59-46d4d84fe045\") " Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.129866 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da42c2c-0330-4ba7-9b59-46d4d84fe045-kube-api-access-6qv9m" (OuterVolumeSpecName: "kube-api-access-6qv9m") pod "9da42c2c-0330-4ba7-9b59-46d4d84fe045" (UID: "9da42c2c-0330-4ba7-9b59-46d4d84fe045"). InnerVolumeSpecName "kube-api-access-6qv9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.142875 4675 generic.go:334] "Generic (PLEG): container finished" podID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerID="0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b" exitCode=0 Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.142954 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" event={"ID":"9da42c2c-0330-4ba7-9b59-46d4d84fe045","Type":"ContainerDied","Data":"0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b"} Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.142981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" event={"ID":"9da42c2c-0330-4ba7-9b59-46d4d84fe045","Type":"ContainerDied","Data":"8cb1cc99aff2e17049118026ad299a1f50e37b657fbc0d3787b459c0ffeb89b2"} Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.142996 4675 scope.go:117] "RemoveContainer" 
containerID="0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.143144 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2dwd4" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.198544 4675 scope.go:117] "RemoveContainer" containerID="ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.199278 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" event={"ID":"b9ddb96c-6950-4300-bf0d-cf65d46c12fb","Type":"ContainerDied","Data":"d373a62e63e9e950263a399f4cc7e953f1dcfef1d2619bfd566fae0d288ec5c5"} Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.199319 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d373a62e63e9e950263a399f4cc7e953f1dcfef1d2619bfd566fae0d288ec5c5" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.199443 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zbgx5" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.215473 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gjbkq" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.218967 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gjbkq" event={"ID":"3f80fe29-bb20-44a9-a687-24bc81243833","Type":"ContainerDied","Data":"6fd0dea373d907c346461fddafa551ecfaab9d6de56dc7daa96b038acf8a30c1"} Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.219005 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd0dea373d907c346461fddafa551ecfaab9d6de56dc7daa96b038acf8a30c1" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.224130 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-config" (OuterVolumeSpecName: "config") pod "9da42c2c-0330-4ba7-9b59-46d4d84fe045" (UID: "9da42c2c-0330-4ba7-9b59-46d4d84fe045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.226420 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qv9m\" (UniqueName: \"kubernetes.io/projected/9da42c2c-0330-4ba7-9b59-46d4d84fe045-kube-api-access-6qv9m\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.226442 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.236310 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9da42c2c-0330-4ba7-9b59-46d4d84fe045" (UID: "9da42c2c-0330-4ba7-9b59-46d4d84fe045"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.249476 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9da42c2c-0330-4ba7-9b59-46d4d84fe045" (UID: "9da42c2c-0330-4ba7-9b59-46d4d84fe045"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.253625 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9da42c2c-0330-4ba7-9b59-46d4d84fe045" (UID: "9da42c2c-0330-4ba7-9b59-46d4d84fe045"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.257627 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9da42c2c-0330-4ba7-9b59-46d4d84fe045" (UID: "9da42c2c-0330-4ba7-9b59-46d4d84fe045"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.262615 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.263290 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f80fe29-bb20-44a9-a687-24bc81243833" containerName="nova-manage" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.263450 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f80fe29-bb20-44a9-a687-24bc81243833" containerName="nova-manage" Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.263560 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerName="dnsmasq-dns" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.263639 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerName="dnsmasq-dns" Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.263749 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ddb96c-6950-4300-bf0d-cf65d46c12fb" containerName="nova-cell1-conductor-db-sync" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.263959 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ddb96c-6950-4300-bf0d-cf65d46c12fb" containerName="nova-cell1-conductor-db-sync" Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.264129 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerName="init" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.264220 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerName="init" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.264436 4675 scope.go:117] "RemoveContainer" containerID="0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b" Mar 20 16:24:36 crc 
kubenswrapper[4675]: I0320 16:24:36.264969 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" containerName="dnsmasq-dns" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.265106 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ddb96c-6950-4300-bf0d-cf65d46c12fb" containerName="nova-cell1-conductor-db-sync" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.265221 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f80fe29-bb20-44a9-a687-24bc81243833" containerName="nova-manage" Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.265029 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b\": container with ID starting with 0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b not found: ID does not exist" containerID="0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.265492 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b"} err="failed to get container status \"0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b\": rpc error: code = NotFound desc = could not find container \"0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b\": container with ID starting with 0cfd4b70d0aa5ab1b913a1aedbde9d2977613606d3798a558644437b81e37b3b not found: ID does not exist" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.265525 4675 scope.go:117] "RemoveContainer" containerID="ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.266411 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.266608 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.269919 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.268684 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc\": container with ID starting with ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc not found: ID does not exist" containerID="ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.271951 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc"} err="failed to get container status \"ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc\": rpc error: code = NotFound desc = could not find container \"ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc\": container with ID starting with ab61d8443f2f1d5a90439da99cad599f3472464c180b991dbff20863a2239ebc not found: ID does not exist" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.280851 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.327851 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa48124b-637f-43d1-9136-24218d69e177-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " 
pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.327934 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8874w\" (UniqueName: \"kubernetes.io/projected/fa48124b-637f-43d1-9136-24218d69e177-kube-api-access-8874w\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.328110 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa48124b-637f-43d1-9136-24218d69e177-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.328241 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.328256 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.328268 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.328281 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9da42c2c-0330-4ba7-9b59-46d4d84fe045-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.330112 4675 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.330130 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:36 crc kubenswrapper[4675]: E0320 16:24:36.404405 4675 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9ddb96c_6950_4300_bf0d_cf65d46c12fb.slice/crio-d373a62e63e9e950263a399f4cc7e953f1dcfef1d2619bfd566fae0d288ec5c5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f80fe29_bb20_44a9_a687_24bc81243833.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9ddb96c_6950_4300_bf0d_cf65d46c12fb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.430649 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa48124b-637f-43d1-9136-24218d69e177-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.430972 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa48124b-637f-43d1-9136-24218d69e177-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.431097 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8874w\" (UniqueName: \"kubernetes.io/projected/fa48124b-637f-43d1-9136-24218d69e177-kube-api-access-8874w\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.434554 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa48124b-637f-43d1-9136-24218d69e177-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.437413 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa48124b-637f-43d1-9136-24218d69e177-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.448437 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8874w\" (UniqueName: \"kubernetes.io/projected/fa48124b-637f-43d1-9136-24218d69e177-kube-api-access-8874w\") pod \"nova-cell1-conductor-0\" (UID: \"fa48124b-637f-43d1-9136-24218d69e177\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.469839 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.470129 4675 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-log" containerID="cri-o://488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3" gracePeriod=30 Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.470187 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-api" containerID="cri-o://44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf" gracePeriod=30 Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.485002 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2dwd4"] Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.501917 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2dwd4"] Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.515553 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.516046 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-log" containerID="cri-o://0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82" gracePeriod=30 Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.516113 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-metadata" containerID="cri-o://045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c" gracePeriod=30 Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.589728 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.700560 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da42c2c-0330-4ba7-9b59-46d4d84fe045" path="/var/lib/kubelet/pods/9da42c2c-0330-4ba7-9b59-46d4d84fe045/volumes" Mar 20 16:24:36 crc kubenswrapper[4675]: I0320 16:24:36.726126 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.121378 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.152891 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229713 4675 generic.go:334] "Generic (PLEG): container finished" podID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerID="045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c" exitCode=0 Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229748 4675 generic.go:334] "Generic (PLEG): container finished" podID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerID="0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82" exitCode=143 Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229790 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229811 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf","Type":"ContainerDied","Data":"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c"} Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229860 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf","Type":"ContainerDied","Data":"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82"} Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229874 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf","Type":"ContainerDied","Data":"3c25f612b37c58e2ff5ef56670fffcf78813e736b29631ca940a85cf733f2bd2"} Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.229875 4675 scope.go:117] "RemoveContainer" containerID="045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.235374 4675 generic.go:334] "Generic (PLEG): container finished" podID="b643d9f5-19ce-4094-8593-7364062a69de" containerID="488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3" exitCode=143 Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.235426 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b643d9f5-19ce-4094-8593-7364062a69de","Type":"ContainerDied","Data":"488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3"} Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.243897 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fa48124b-637f-43d1-9136-24218d69e177","Type":"ContainerStarted","Data":"cfe9224ea55c33a0cfb3771d25d327e6ac035004852e5acb2587e6a11590d06b"} Mar 
20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.245071 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cp56\" (UniqueName: \"kubernetes.io/projected/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-kube-api-access-7cp56\") pod \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.245153 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-combined-ca-bundle\") pod \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.245269 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-logs\") pod \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.245396 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-nova-metadata-tls-certs\") pod \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.245422 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-config-data\") pod \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\" (UID: \"69e43f4e-7ab7-4f2b-86eb-3efea012e4bf\") " Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.245954 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-logs" 
(OuterVolumeSpecName: "logs") pod "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" (UID: "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.251216 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-kube-api-access-7cp56" (OuterVolumeSpecName: "kube-api-access-7cp56") pod "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" (UID: "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf"). InnerVolumeSpecName "kube-api-access-7cp56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.264177 4675 scope.go:117] "RemoveContainer" containerID="0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.280116 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-config-data" (OuterVolumeSpecName: "config-data") pod "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" (UID: "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.281293 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" (UID: "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.286621 4675 scope.go:117] "RemoveContainer" containerID="045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c" Mar 20 16:24:37 crc kubenswrapper[4675]: E0320 16:24:37.287375 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c\": container with ID starting with 045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c not found: ID does not exist" containerID="045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.287414 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c"} err="failed to get container status \"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c\": rpc error: code = NotFound desc = could not find container \"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c\": container with ID starting with 045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c not found: ID does not exist" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.287435 4675 scope.go:117] "RemoveContainer" containerID="0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82" Mar 20 16:24:37 crc kubenswrapper[4675]: E0320 16:24:37.287835 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82\": container with ID starting with 0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82 not found: ID does not exist" containerID="0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.287870 
4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82"} err="failed to get container status \"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82\": rpc error: code = NotFound desc = could not find container \"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82\": container with ID starting with 0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82 not found: ID does not exist" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.287894 4675 scope.go:117] "RemoveContainer" containerID="045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.288138 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c"} err="failed to get container status \"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c\": rpc error: code = NotFound desc = could not find container \"045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c\": container with ID starting with 045c10de5b3957be4e7978d6755fc6ebd33c751bbbb879bdd8c72d30bae1eb3c not found: ID does not exist" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.288165 4675 scope.go:117] "RemoveContainer" containerID="0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.288373 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82"} err="failed to get container status \"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82\": rpc error: code = NotFound desc = could not find container \"0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82\": container with ID starting with 
0e70e9727663dfa26e17f26906d2a4b257a5604a3fad486eafd1df9b9af93c82 not found: ID does not exist" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.307176 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" (UID: "69e43f4e-7ab7-4f2b-86eb-3efea012e4bf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.347753 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.347812 4675 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.347829 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.347840 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cp56\" (UniqueName: \"kubernetes.io/projected/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-kube-api-access-7cp56\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.347850 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.625553 4675 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.648330 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.666293 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:37 crc kubenswrapper[4675]: E0320 16:24:37.666801 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-metadata" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.666822 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-metadata" Mar 20 16:24:37 crc kubenswrapper[4675]: E0320 16:24:37.666877 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-log" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.666886 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-log" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.667109 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-log" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.667131 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" containerName="nova-metadata-metadata" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.668354 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.671973 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.672204 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.694130 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.756331 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-logs\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.756406 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m6w\" (UniqueName: \"kubernetes.io/projected/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-kube-api-access-l4m6w\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.756462 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-config-data\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.756539 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.756736 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.858227 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.858297 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-logs\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.858354 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m6w\" (UniqueName: \"kubernetes.io/projected/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-kube-api-access-l4m6w\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.858376 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-config-data\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.858423 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.858862 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-logs\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.862642 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.864339 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.866470 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-config-data\") pod \"nova-metadata-0\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.874871 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m6w\" (UniqueName: \"kubernetes.io/projected/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-kube-api-access-l4m6w\") pod \"nova-metadata-0\" 
(UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4675]: I0320 16:24:37.989930 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:38 crc kubenswrapper[4675]: I0320 16:24:38.324233 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fa48124b-637f-43d1-9136-24218d69e177","Type":"ContainerStarted","Data":"0c155d7df55cab22273d35c85703fe434f4e83fe40948f0d48f0a3de182c92ec"} Mar 20 16:24:38 crc kubenswrapper[4675]: I0320 16:24:38.325012 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:38 crc kubenswrapper[4675]: I0320 16:24:38.342173 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" containerName="nova-scheduler-scheduler" containerID="cri-o://af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645" gracePeriod=30 Mar 20 16:24:38 crc kubenswrapper[4675]: I0320 16:24:38.357719 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.357700449 podStartE2EDuration="2.357700449s" podCreationTimestamp="2026-03-20 16:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:38.355753904 +0000 UTC m=+1398.389383441" watchObservedRunningTime="2026-03-20 16:24:38.357700449 +0000 UTC m=+1398.391329986" Mar 20 16:24:38 crc kubenswrapper[4675]: W0320 16:24:38.369607 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee16f9b2_f9aa_42c4_8e19_29da55f3c861.slice/crio-2e6b35eeccd82b4c61da48ec26acc8918a1f4ad00c36b799f8d5869233a05565 WatchSource:0}: Error finding 
container 2e6b35eeccd82b4c61da48ec26acc8918a1f4ad00c36b799f8d5869233a05565: Status 404 returned error can't find the container with id 2e6b35eeccd82b4c61da48ec26acc8918a1f4ad00c36b799f8d5869233a05565 Mar 20 16:24:38 crc kubenswrapper[4675]: I0320 16:24:38.385130 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:38 crc kubenswrapper[4675]: I0320 16:24:38.686921 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e43f4e-7ab7-4f2b-86eb-3efea012e4bf" path="/var/lib/kubelet/pods/69e43f4e-7ab7-4f2b-86eb-3efea012e4bf/volumes" Mar 20 16:24:39 crc kubenswrapper[4675]: I0320 16:24:39.358789 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee16f9b2-f9aa-42c4-8e19-29da55f3c861","Type":"ContainerStarted","Data":"65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81"} Mar 20 16:24:39 crc kubenswrapper[4675]: I0320 16:24:39.359133 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee16f9b2-f9aa-42c4-8e19-29da55f3c861","Type":"ContainerStarted","Data":"0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0"} Mar 20 16:24:39 crc kubenswrapper[4675]: I0320 16:24:39.359147 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee16f9b2-f9aa-42c4-8e19-29da55f3c861","Type":"ContainerStarted","Data":"2e6b35eeccd82b4c61da48ec26acc8918a1f4ad00c36b799f8d5869233a05565"} Mar 20 16:24:39 crc kubenswrapper[4675]: I0320 16:24:39.389143 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.389121455 podStartE2EDuration="2.389121455s" podCreationTimestamp="2026-03-20 16:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:39.384482733 +0000 UTC m=+1399.418112300" 
watchObservedRunningTime="2026-03-20 16:24:39.389121455 +0000 UTC m=+1399.422750992" Mar 20 16:24:40 crc kubenswrapper[4675]: I0320 16:24:40.075320 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 16:24:40 crc kubenswrapper[4675]: E0320 16:24:40.437842 4675 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:24:40 crc kubenswrapper[4675]: E0320 16:24:40.439078 4675 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:24:40 crc kubenswrapper[4675]: E0320 16:24:40.440226 4675 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:24:40 crc kubenswrapper[4675]: E0320 16:24:40.440270 4675 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" containerName="nova-scheduler-scheduler" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.384550 4675 generic.go:334] "Generic (PLEG): container finished" podID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" 
containerID="af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645" exitCode=0 Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.384672 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a20f49a-f98a-4a00-839d-5c57cf7551a2","Type":"ContainerDied","Data":"af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645"} Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.750123 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.831402 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtnzk\" (UniqueName: \"kubernetes.io/projected/6a20f49a-f98a-4a00-839d-5c57cf7551a2-kube-api-access-rtnzk\") pod \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.831527 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-config-data\") pod \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.831567 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-combined-ca-bundle\") pod \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\" (UID: \"6a20f49a-f98a-4a00-839d-5c57cf7551a2\") " Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.837402 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a20f49a-f98a-4a00-839d-5c57cf7551a2-kube-api-access-rtnzk" (OuterVolumeSpecName: "kube-api-access-rtnzk") pod "6a20f49a-f98a-4a00-839d-5c57cf7551a2" (UID: 
"6a20f49a-f98a-4a00-839d-5c57cf7551a2"). InnerVolumeSpecName "kube-api-access-rtnzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.861495 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a20f49a-f98a-4a00-839d-5c57cf7551a2" (UID: "6a20f49a-f98a-4a00-839d-5c57cf7551a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.870874 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-config-data" (OuterVolumeSpecName: "config-data") pod "6a20f49a-f98a-4a00-839d-5c57cf7551a2" (UID: "6a20f49a-f98a-4a00-839d-5c57cf7551a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.934733 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtnzk\" (UniqueName: \"kubernetes.io/projected/6a20f49a-f98a-4a00-839d-5c57cf7551a2-kube-api-access-rtnzk\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.934784 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4675]: I0320 16:24:41.934795 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20f49a-f98a-4a00-839d-5c57cf7551a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.274973 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.345068 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-config-data\") pod \"b643d9f5-19ce-4094-8593-7364062a69de\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.345206 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-combined-ca-bundle\") pod \"b643d9f5-19ce-4094-8593-7364062a69de\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.345282 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bllk\" (UniqueName: \"kubernetes.io/projected/b643d9f5-19ce-4094-8593-7364062a69de-kube-api-access-8bllk\") pod \"b643d9f5-19ce-4094-8593-7364062a69de\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.345379 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b643d9f5-19ce-4094-8593-7364062a69de-logs\") pod \"b643d9f5-19ce-4094-8593-7364062a69de\" (UID: \"b643d9f5-19ce-4094-8593-7364062a69de\") " Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.345845 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b643d9f5-19ce-4094-8593-7364062a69de-logs" (OuterVolumeSpecName: "logs") pod "b643d9f5-19ce-4094-8593-7364062a69de" (UID: "b643d9f5-19ce-4094-8593-7364062a69de"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.346003 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b643d9f5-19ce-4094-8593-7364062a69de-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.350205 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b643d9f5-19ce-4094-8593-7364062a69de-kube-api-access-8bllk" (OuterVolumeSpecName: "kube-api-access-8bllk") pod "b643d9f5-19ce-4094-8593-7364062a69de" (UID: "b643d9f5-19ce-4094-8593-7364062a69de"). InnerVolumeSpecName "kube-api-access-8bllk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.374949 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b643d9f5-19ce-4094-8593-7364062a69de" (UID: "b643d9f5-19ce-4094-8593-7364062a69de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.385848 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-config-data" (OuterVolumeSpecName: "config-data") pod "b643d9f5-19ce-4094-8593-7364062a69de" (UID: "b643d9f5-19ce-4094-8593-7364062a69de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.398165 4675 generic.go:334] "Generic (PLEG): container finished" podID="b643d9f5-19ce-4094-8593-7364062a69de" containerID="44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf" exitCode=0 Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.398219 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.398558 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b643d9f5-19ce-4094-8593-7364062a69de","Type":"ContainerDied","Data":"44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf"} Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.398626 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b643d9f5-19ce-4094-8593-7364062a69de","Type":"ContainerDied","Data":"8391fed66262da15dbb2f1f24745d73d9621b3c76b5df7fcf24dd33c67ccab76"} Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.398652 4675 scope.go:117] "RemoveContainer" containerID="44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.407072 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a20f49a-f98a-4a00-839d-5c57cf7551a2","Type":"ContainerDied","Data":"19797b313a76a3fbaa5e9938628bcede59bc123b35064cb219cdbe1e2d2b1499"} Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.407132 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.449023 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.449297 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bllk\" (UniqueName: \"kubernetes.io/projected/b643d9f5-19ce-4094-8593-7364062a69de-kube-api-access-8bllk\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.449455 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b643d9f5-19ce-4094-8593-7364062a69de-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.490043 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.510265 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.511435 4675 scope.go:117] "RemoveContainer" containerID="488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.522405 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.530966 4675 scope.go:117] "RemoveContainer" containerID="44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf" Mar 20 16:24:42 crc kubenswrapper[4675]: E0320 16:24:42.531585 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf\": container with ID starting with 
44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf not found: ID does not exist" containerID="44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.531703 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf"} err="failed to get container status \"44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf\": rpc error: code = NotFound desc = could not find container \"44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf\": container with ID starting with 44d51c0477305e8d9dbf2b74cdc02cba0690e19c44818d55bc3d90e6040e1cdf not found: ID does not exist" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.531799 4675 scope.go:117] "RemoveContainer" containerID="488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3" Mar 20 16:24:42 crc kubenswrapper[4675]: E0320 16:24:42.532101 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3\": container with ID starting with 488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3 not found: ID does not exist" containerID="488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.532184 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3"} err="failed to get container status \"488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3\": rpc error: code = NotFound desc = could not find container \"488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3\": container with ID starting with 488029ae353e76db90811c5f8ad1a237987a3be922ddb8e2d903229c6b62cff3 not found: ID does not 
exist" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.532259 4675 scope.go:117] "RemoveContainer" containerID="af449c6471a2c5f319df52be08fc7f6f8ee2f0f39c058444f8efa4c291fcb645" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.559213 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.571419 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: E0320 16:24:42.571793 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-api" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.571809 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-api" Mar 20 16:24:42 crc kubenswrapper[4675]: E0320 16:24:42.571824 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-log" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.571831 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-log" Mar 20 16:24:42 crc kubenswrapper[4675]: E0320 16:24:42.571843 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" containerName="nova-scheduler-scheduler" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.571849 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" containerName="nova-scheduler-scheduler" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.572213 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" containerName="nova-scheduler-scheduler" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.572230 4675 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-log" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.572251 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b643d9f5-19ce-4094-8593-7364062a69de" containerName="nova-api-api" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.572832 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.577109 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.580897 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.582687 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.585146 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.590570 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.603265 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.654876 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-config-data\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.654922 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.654968 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfnpk\" (UniqueName: \"kubernetes.io/projected/070526b2-1aa9-4114-bf6a-327317631cf5-kube-api-access-cfnpk\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.655111 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mnq2\" (UniqueName: \"kubernetes.io/projected/fcdab570-2836-4f45-98c8-974aab349015-kube-api-access-6mnq2\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.655169 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.655215 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070526b2-1aa9-4114-bf6a-327317631cf5-logs\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.655262 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-config-data\") pod \"nova-scheduler-0\" 
(UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.684864 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a20f49a-f98a-4a00-839d-5c57cf7551a2" path="/var/lib/kubelet/pods/6a20f49a-f98a-4a00-839d-5c57cf7551a2/volumes" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.685598 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b643d9f5-19ce-4094-8593-7364062a69de" path="/var/lib/kubelet/pods/b643d9f5-19ce-4094-8593-7364062a69de/volumes" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756601 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070526b2-1aa9-4114-bf6a-327317631cf5-logs\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756710 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-config-data\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756739 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-config-data\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756778 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 
16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756818 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfnpk\" (UniqueName: \"kubernetes.io/projected/070526b2-1aa9-4114-bf6a-327317631cf5-kube-api-access-cfnpk\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756885 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mnq2\" (UniqueName: \"kubernetes.io/projected/fcdab570-2836-4f45-98c8-974aab349015-kube-api-access-6mnq2\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.756937 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.757163 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070526b2-1aa9-4114-bf6a-327317631cf5-logs\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.760759 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.760833 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.761991 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-config-data\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.763383 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-config-data\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.774999 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mnq2\" (UniqueName: \"kubernetes.io/projected/fcdab570-2836-4f45-98c8-974aab349015-kube-api-access-6mnq2\") pod \"nova-scheduler-0\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.776985 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfnpk\" (UniqueName: \"kubernetes.io/projected/070526b2-1aa9-4114-bf6a-327317631cf5-kube-api-access-cfnpk\") pod \"nova-api-0\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.890300 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:42 crc kubenswrapper[4675]: I0320 16:24:42.902562 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:43 crc kubenswrapper[4675]: I0320 16:24:43.446008 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:43 crc kubenswrapper[4675]: I0320 16:24:43.517094 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:43 crc kubenswrapper[4675]: I0320 16:24:43.556526 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:24:43 crc kubenswrapper[4675]: I0320 16:24:43.556838 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" containerName="kube-state-metrics" containerID="cri-o://0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327" gracePeriod=30 Mar 20 16:24:43 crc kubenswrapper[4675]: I0320 16:24:43.969464 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.097332 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jv6m\" (UniqueName: \"kubernetes.io/projected/f8f8e55f-429c-43cf-9aea-9524bf3caac7-kube-api-access-6jv6m\") pod \"f8f8e55f-429c-43cf-9aea-9524bf3caac7\" (UID: \"f8f8e55f-429c-43cf-9aea-9524bf3caac7\") " Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.102958 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f8e55f-429c-43cf-9aea-9524bf3caac7-kube-api-access-6jv6m" (OuterVolumeSpecName: "kube-api-access-6jv6m") pod "f8f8e55f-429c-43cf-9aea-9524bf3caac7" (UID: "f8f8e55f-429c-43cf-9aea-9524bf3caac7"). InnerVolumeSpecName "kube-api-access-6jv6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.199254 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jv6m\" (UniqueName: \"kubernetes.io/projected/f8f8e55f-429c-43cf-9aea-9524bf3caac7-kube-api-access-6jv6m\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.429469 4675 generic.go:334] "Generic (PLEG): container finished" podID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" containerID="0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327" exitCode=2 Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.430253 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8f8e55f-429c-43cf-9aea-9524bf3caac7","Type":"ContainerDied","Data":"0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.430365 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8f8e55f-429c-43cf-9aea-9524bf3caac7","Type":"ContainerDied","Data":"8472170ec9b8397231a9946daca64413562223bcbe0568878063b914a887da36"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.430461 4675 scope.go:117] "RemoveContainer" containerID="0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.430693 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.444593 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdab570-2836-4f45-98c8-974aab349015","Type":"ContainerStarted","Data":"24ded4262b64a4cb4e39d8797393d19c127c96aab7648815732be701ac90ba0c"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.444640 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdab570-2836-4f45-98c8-974aab349015","Type":"ContainerStarted","Data":"45bfa8e10ebf5064e93ad10ab407fd16672aade7c45816790e768e555aab85c8"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.448116 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"070526b2-1aa9-4114-bf6a-327317631cf5","Type":"ContainerStarted","Data":"a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.448152 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"070526b2-1aa9-4114-bf6a-327317631cf5","Type":"ContainerStarted","Data":"a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.448161 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"070526b2-1aa9-4114-bf6a-327317631cf5","Type":"ContainerStarted","Data":"c22843dd84fc8aaf7fcd68d4ada1f69fce82de463aea0c268983f26c778afd52"} Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.466563 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.466545435 podStartE2EDuration="2.466545435s" podCreationTimestamp="2026-03-20 16:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
16:24:44.460154815 +0000 UTC m=+1404.493784362" watchObservedRunningTime="2026-03-20 16:24:44.466545435 +0000 UTC m=+1404.500174972" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.481229 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.48120882 podStartE2EDuration="2.48120882s" podCreationTimestamp="2026-03-20 16:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:44.477260218 +0000 UTC m=+1404.510889755" watchObservedRunningTime="2026-03-20 16:24:44.48120882 +0000 UTC m=+1404.514838357" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.520831 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.522210 4675 scope.go:117] "RemoveContainer" containerID="0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327" Mar 20 16:24:44 crc kubenswrapper[4675]: E0320 16:24:44.522625 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327\": container with ID starting with 0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327 not found: ID does not exist" containerID="0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.522653 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327"} err="failed to get container status \"0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327\": rpc error: code = NotFound desc = could not find container \"0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327\": container with ID starting with 
0cfb9d3e60c071b7f5b1b9dc4052f0782069f66dbf365d872540e6a8e25db327 not found: ID does not exist" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.531375 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.540592 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:24:44 crc kubenswrapper[4675]: E0320 16:24:44.541138 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" containerName="kube-state-metrics" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.541158 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" containerName="kube-state-metrics" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.541337 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" containerName="kube-state-metrics" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.541972 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.543893 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.545258 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.547684 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.620211 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc8xr\" (UniqueName: \"kubernetes.io/projected/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-api-access-zc8xr\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.620286 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.620331 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.620473 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.686787 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f8e55f-429c-43cf-9aea-9524bf3caac7" path="/var/lib/kubelet/pods/f8f8e55f-429c-43cf-9aea-9524bf3caac7/volumes" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.722229 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.722319 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc8xr\" (UniqueName: \"kubernetes.io/projected/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-api-access-zc8xr\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.722385 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.722410 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " 
pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.729637 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.730590 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.731198 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.751329 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc8xr\" (UniqueName: \"kubernetes.io/projected/8f2c52ef-b68a-4077-80f4-455f7feb3f0e-kube-api-access-zc8xr\") pod \"kube-state-metrics-0\" (UID: \"8f2c52ef-b68a-4077-80f4-455f7feb3f0e\") " pod="openstack/kube-state-metrics-0" Mar 20 16:24:44 crc kubenswrapper[4675]: I0320 16:24:44.916292 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.374836 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.418388 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.418897 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="sg-core" containerID="cri-o://ab3ebb1bc412726c5f5c88269795bfc729d4c428b63a02c6fd2cd70ef3156880" gracePeriod=30 Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.418926 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="proxy-httpd" containerID="cri-o://921d192ff6602ea774199176cb4a46efefee38889072b25d0c2f9a7089c91822" gracePeriod=30 Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.418984 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-notification-agent" containerID="cri-o://83a191ea65813943c763870c0cebf256c173bb5b3a0068521210e20a554684b4" gracePeriod=30 Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.418858 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-central-agent" containerID="cri-o://e0c5591cddeb0434d1d8c09584451c7acdabc6ef73c7423de0b7479050701dcc" gracePeriod=30 Mar 20 16:24:45 crc kubenswrapper[4675]: I0320 16:24:45.457898 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8f2c52ef-b68a-4077-80f4-455f7feb3f0e","Type":"ContainerStarted","Data":"cf2e97487d63b87ed65bd3677df42ee7053141a0e779057fb761900eddb59085"} Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.473152 4675 generic.go:334] "Generic (PLEG): container finished" podID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerID="921d192ff6602ea774199176cb4a46efefee38889072b25d0c2f9a7089c91822" exitCode=0 Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.474201 4675 generic.go:334] "Generic (PLEG): container finished" podID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerID="ab3ebb1bc412726c5f5c88269795bfc729d4c428b63a02c6fd2cd70ef3156880" exitCode=2 Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.474271 4675 generic.go:334] "Generic (PLEG): container finished" podID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerID="e0c5591cddeb0434d1d8c09584451c7acdabc6ef73c7423de0b7479050701dcc" exitCode=0 Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.474276 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerDied","Data":"921d192ff6602ea774199176cb4a46efefee38889072b25d0c2f9a7089c91822"} Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.474436 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerDied","Data":"ab3ebb1bc412726c5f5c88269795bfc729d4c428b63a02c6fd2cd70ef3156880"} Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.474508 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerDied","Data":"e0c5591cddeb0434d1d8c09584451c7acdabc6ef73c7423de0b7479050701dcc"} Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.478387 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8f2c52ef-b68a-4077-80f4-455f7feb3f0e","Type":"ContainerStarted","Data":"f945d65a7ca615122307c70d37fad26185af416ae2587a1cdf1bbe0dc1d06918"} Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.478567 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.494435 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.111775793 podStartE2EDuration="2.494419117s" podCreationTimestamp="2026-03-20 16:24:44 +0000 UTC" firstStartedPulling="2026-03-20 16:24:45.384649355 +0000 UTC m=+1405.418278892" lastFinishedPulling="2026-03-20 16:24:45.767292679 +0000 UTC m=+1405.800922216" observedRunningTime="2026-03-20 16:24:46.49206376 +0000 UTC m=+1406.525693297" watchObservedRunningTime="2026-03-20 16:24:46.494419117 +0000 UTC m=+1406.528048654" Mar 20 16:24:46 crc kubenswrapper[4675]: I0320 16:24:46.636509 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:47 crc kubenswrapper[4675]: I0320 16:24:47.890464 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 16:24:47 crc kubenswrapper[4675]: I0320 16:24:47.990834 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:24:47 crc kubenswrapper[4675]: I0320 16:24:47.990892 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:24:48 crc kubenswrapper[4675]: I0320 16:24:48.501070 4675 generic.go:334] "Generic (PLEG): container finished" podID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerID="83a191ea65813943c763870c0cebf256c173bb5b3a0068521210e20a554684b4" exitCode=0 Mar 20 16:24:48 crc kubenswrapper[4675]: I0320 16:24:48.501240 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerDied","Data":"83a191ea65813943c763870c0cebf256c173bb5b3a0068521210e20a554684b4"} Mar 20 16:24:48 crc kubenswrapper[4675]: I0320 16:24:48.832858 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.002959 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.002996 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003181 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-config-data\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003395 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9krm4\" (UniqueName: \"kubernetes.io/projected/79f846df-cb1b-4441-b985-95d4b76be0fa-kube-api-access-9krm4\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003461 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-scripts\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003510 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-sg-core-conf-yaml\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003538 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-run-httpd\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003567 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-combined-ca-bundle\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.003608 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-log-httpd\") pod \"79f846df-cb1b-4441-b985-95d4b76be0fa\" (UID: \"79f846df-cb1b-4441-b985-95d4b76be0fa\") " Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.004633 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.004970 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.009051 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-scripts" (OuterVolumeSpecName: "scripts") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.035033 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f846df-cb1b-4441-b985-95d4b76be0fa-kube-api-access-9krm4" (OuterVolumeSpecName: "kube-api-access-9krm4") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "kube-api-access-9krm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.052020 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.106989 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9krm4\" (UniqueName: \"kubernetes.io/projected/79f846df-cb1b-4441-b985-95d4b76be0fa-kube-api-access-9krm4\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.107026 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.107037 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.107045 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.107054 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f846df-cb1b-4441-b985-95d4b76be0fa-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.122923 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-config-data" (OuterVolumeSpecName: "config-data") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.136894 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79f846df-cb1b-4441-b985-95d4b76be0fa" (UID: "79f846df-cb1b-4441-b985-95d4b76be0fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.210210 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.210261 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f846df-cb1b-4441-b985-95d4b76be0fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.514736 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f846df-cb1b-4441-b985-95d4b76be0fa","Type":"ContainerDied","Data":"69ce684bb7e82c664c02261c096634e9a403802db47247d36ee474e3b3f4339b"} Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.514814 4675 scope.go:117] "RemoveContainer" containerID="921d192ff6602ea774199176cb4a46efefee38889072b25d0c2f9a7089c91822" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.514996 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.587191 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.592336 4675 scope.go:117] "RemoveContainer" containerID="ab3ebb1bc412726c5f5c88269795bfc729d4c428b63a02c6fd2cd70ef3156880" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.612410 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.633000 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:49 crc kubenswrapper[4675]: E0320 16:24:49.634165 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-notification-agent" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.634271 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-notification-agent" Mar 20 16:24:49 crc kubenswrapper[4675]: E0320 16:24:49.634355 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="sg-core" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.634426 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="sg-core" Mar 20 16:24:49 crc kubenswrapper[4675]: E0320 16:24:49.634492 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="proxy-httpd" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.634561 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="proxy-httpd" Mar 20 16:24:49 crc kubenswrapper[4675]: E0320 16:24:49.634677 4675 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-central-agent" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.634749 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-central-agent" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.635105 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-central-agent" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.639138 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="ceilometer-notification-agent" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.639227 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="proxy-httpd" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.639327 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" containerName="sg-core" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.641093 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.647624 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.647961 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.648489 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.659201 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.671955 4675 scope.go:117] "RemoveContainer" containerID="83a191ea65813943c763870c0cebf256c173bb5b3a0068521210e20a554684b4" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.697179 4675 scope.go:117] "RemoveContainer" containerID="e0c5591cddeb0434d1d8c09584451c7acdabc6ef73c7423de0b7479050701dcc" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721469 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-scripts\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721560 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-run-httpd\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721610 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721628 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721684 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhwpk\" (UniqueName: \"kubernetes.io/projected/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-kube-api-access-mhwpk\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721699 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-config-data\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721730 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-log-httpd\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.721745 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.822869 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhwpk\" (UniqueName: \"kubernetes.io/projected/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-kube-api-access-mhwpk\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.822919 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-config-data\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.822954 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-log-httpd\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.822972 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.823018 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-scripts\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.823078 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-run-httpd\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.823140 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.823163 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.823507 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-log-httpd\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.823624 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-run-httpd\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.828369 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 
16:24:49.828573 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-scripts\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.830425 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-config-data\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.831409 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.848528 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.861779 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhwpk\" (UniqueName: \"kubernetes.io/projected/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-kube-api-access-mhwpk\") pod \"ceilometer-0\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") " pod="openstack/ceilometer-0" Mar 20 16:24:49 crc kubenswrapper[4675]: I0320 16:24:49.977379 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:50 crc kubenswrapper[4675]: W0320 16:24:50.525911 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cb7e9a2_004e_4cc8_962a_939b2fb41f75.slice/crio-fcef7eae6aa81059b21bf54f9f0736749469eaa599375cc83d69fc0fa240bc7b WatchSource:0}: Error finding container fcef7eae6aa81059b21bf54f9f0736749469eaa599375cc83d69fc0fa240bc7b: Status 404 returned error can't find the container with id fcef7eae6aa81059b21bf54f9f0736749469eaa599375cc83d69fc0fa240bc7b Mar 20 16:24:50 crc kubenswrapper[4675]: I0320 16:24:50.532614 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:50 crc kubenswrapper[4675]: I0320 16:24:50.685066 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f846df-cb1b-4441-b985-95d4b76be0fa" path="/var/lib/kubelet/pods/79f846df-cb1b-4441-b985-95d4b76be0fa/volumes" Mar 20 16:24:51 crc kubenswrapper[4675]: I0320 16:24:51.537722 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerStarted","Data":"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"} Mar 20 16:24:51 crc kubenswrapper[4675]: I0320 16:24:51.538020 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerStarted","Data":"fcef7eae6aa81059b21bf54f9f0736749469eaa599375cc83d69fc0fa240bc7b"} Mar 20 16:24:52 crc kubenswrapper[4675]: I0320 16:24:52.891496 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 16:24:52 crc kubenswrapper[4675]: I0320 16:24:52.903089 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:52 crc kubenswrapper[4675]: I0320 16:24:52.903146 4675 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:52 crc kubenswrapper[4675]: I0320 16:24:52.935749 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 16:24:53 crc kubenswrapper[4675]: I0320 16:24:53.563194 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerStarted","Data":"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"} Mar 20 16:24:53 crc kubenswrapper[4675]: I0320 16:24:53.604165 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 16:24:53 crc kubenswrapper[4675]: I0320 16:24:53.945371 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:53 crc kubenswrapper[4675]: I0320 16:24:53.945706 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:54 crc kubenswrapper[4675]: I0320 16:24:54.572555 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerStarted","Data":"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"} Mar 20 16:24:54 crc kubenswrapper[4675]: I0320 16:24:54.740039 4675 scope.go:117] "RemoveContainer" containerID="a1b063b121dead7b9addc91d2381721786b2495c65eeff023e0eb4174823569b" Mar 20 16:24:54 crc kubenswrapper[4675]: I0320 16:24:54.927905 
4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 16:24:55 crc kubenswrapper[4675]: I0320 16:24:55.990276 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:24:55 crc kubenswrapper[4675]: I0320 16:24:55.992094 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:24:56 crc kubenswrapper[4675]: I0320 16:24:56.600536 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerStarted","Data":"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"} Mar 20 16:24:56 crc kubenswrapper[4675]: I0320 16:24:56.633305 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.628399486 podStartE2EDuration="7.633285236s" podCreationTimestamp="2026-03-20 16:24:49 +0000 UTC" firstStartedPulling="2026-03-20 16:24:50.529339528 +0000 UTC m=+1410.562969065" lastFinishedPulling="2026-03-20 16:24:55.534225288 +0000 UTC m=+1415.567854815" observedRunningTime="2026-03-20 16:24:56.628922933 +0000 UTC m=+1416.662552490" watchObservedRunningTime="2026-03-20 16:24:56.633285236 +0000 UTC m=+1416.666914773" Mar 20 16:24:57 crc kubenswrapper[4675]: I0320 16:24:57.609784 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:24:57 crc kubenswrapper[4675]: I0320 16:24:57.996119 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:24:57 crc kubenswrapper[4675]: I0320 16:24:57.996642 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:24:58 crc kubenswrapper[4675]: I0320 16:24:58.002614 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 20 16:24:58 crc kubenswrapper[4675]: I0320 16:24:58.005333 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.489235 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.636138 4675 generic.go:334] "Generic (PLEG): container finished" podID="a94f8da9-3074-4101-a3ed-f1bddacf8ce5" containerID="ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d" exitCode=137 Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.636198 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.636220 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94f8da9-3074-4101-a3ed-f1bddacf8ce5","Type":"ContainerDied","Data":"ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d"} Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.636728 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94f8da9-3074-4101-a3ed-f1bddacf8ce5","Type":"ContainerDied","Data":"9b4c2d450b42c011af14d71117cfcae21215158a247df3d18f19bb20b84dcf51"} Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.636758 4675 scope.go:117] "RemoveContainer" containerID="ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d" Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.640845 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-combined-ca-bundle\") pod \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") " Mar 20 16:25:00 
crc kubenswrapper[4675]: I0320 16:25:00.640898 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-config-data\") pod \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") "
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.640944 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgk8s\" (UniqueName: \"kubernetes.io/projected/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-kube-api-access-qgk8s\") pod \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\" (UID: \"a94f8da9-3074-4101-a3ed-f1bddacf8ce5\") "
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.647979 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-kube-api-access-qgk8s" (OuterVolumeSpecName: "kube-api-access-qgk8s") pod "a94f8da9-3074-4101-a3ed-f1bddacf8ce5" (UID: "a94f8da9-3074-4101-a3ed-f1bddacf8ce5"). InnerVolumeSpecName "kube-api-access-qgk8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.668148 4675 scope.go:117] "RemoveContainer" containerID="ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.668470 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a94f8da9-3074-4101-a3ed-f1bddacf8ce5" (UID: "a94f8da9-3074-4101-a3ed-f1bddacf8ce5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.676310 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-config-data" (OuterVolumeSpecName: "config-data") pod "a94f8da9-3074-4101-a3ed-f1bddacf8ce5" (UID: "a94f8da9-3074-4101-a3ed-f1bddacf8ce5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:00 crc kubenswrapper[4675]: E0320 16:25:00.677779 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d\": container with ID starting with ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d not found: ID does not exist" containerID="ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.677926 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d"} err="failed to get container status \"ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d\": rpc error: code = NotFound desc = could not find container \"ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d\": container with ID starting with ed0d601a5c6adb9d99eab635139fe39b3eac02139ed04defba5946fd1173b89d not found: ID does not exist"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.742734 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.742794 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.742812 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgk8s\" (UniqueName: \"kubernetes.io/projected/a94f8da9-3074-4101-a3ed-f1bddacf8ce5-kube-api-access-qgk8s\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.903080 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.903164 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.963411 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.980396 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.988589 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:25:00 crc kubenswrapper[4675]: E0320 16:25:00.988987 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94f8da9-3074-4101-a3ed-f1bddacf8ce5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.989003 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94f8da9-3074-4101-a3ed-f1bddacf8ce5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.989206 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94f8da9-3074-4101-a3ed-f1bddacf8ce5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.989829 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.993338 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.993398 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 20 16:25:00 crc kubenswrapper[4675]: I0320 16:25:00.993398 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.002524 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.149300 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.149383 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgslb\" (UniqueName: \"kubernetes.io/projected/a3395380-db60-4ec1-9526-0a0796b45d73-kube-api-access-dgslb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.149411 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.149436 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.149474 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.251527 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.251603 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgslb\" (UniqueName: \"kubernetes.io/projected/a3395380-db60-4ec1-9526-0a0796b45d73-kube-api-access-dgslb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.251632 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.251653 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.251688 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.262472 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.262545 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.262627 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.262751 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3395380-db60-4ec1-9526-0a0796b45d73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.274280 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgslb\" (UniqueName: \"kubernetes.io/projected/a3395380-db60-4ec1-9526-0a0796b45d73-kube-api-access-dgslb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3395380-db60-4ec1-9526-0a0796b45d73\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.319021 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:01 crc kubenswrapper[4675]: I0320 16:25:01.780466 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:25:01 crc kubenswrapper[4675]: W0320 16:25:01.782443 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3395380_db60_4ec1_9526_0a0796b45d73.slice/crio-417202fa780ce720693fa3d30efc84dad0cbd0662e9aa1444b66a614e78afdde WatchSource:0}: Error finding container 417202fa780ce720693fa3d30efc84dad0cbd0662e9aa1444b66a614e78afdde: Status 404 returned error can't find the container with id 417202fa780ce720693fa3d30efc84dad0cbd0662e9aa1444b66a614e78afdde
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.657400 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3395380-db60-4ec1-9526-0a0796b45d73","Type":"ContainerStarted","Data":"7a9b6bf57f00b88ec1852c563dc92928815e2d1cb1f8822a0811164ec5cd79d8"}
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.657738 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3395380-db60-4ec1-9526-0a0796b45d73","Type":"ContainerStarted","Data":"417202fa780ce720693fa3d30efc84dad0cbd0662e9aa1444b66a614e78afdde"}
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.684971 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.684951805 podStartE2EDuration="2.684951805s" podCreationTimestamp="2026-03-20 16:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:02.680060086 +0000 UTC m=+1422.713689623" watchObservedRunningTime="2026-03-20 16:25:02.684951805 +0000 UTC m=+1422.718581342"
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.700564 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94f8da9-3074-4101-a3ed-f1bddacf8ce5" path="/var/lib/kubelet/pods/a94f8da9-3074-4101-a3ed-f1bddacf8ce5/volumes"
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.907914 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.908530 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 16:25:02 crc kubenswrapper[4675]: I0320 16:25:02.911636 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 16:25:03 crc kubenswrapper[4675]: I0320 16:25:03.677591 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 16:25:03 crc kubenswrapper[4675]: I0320 16:25:03.845867 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-m255x"]
Mar 20 16:25:03 crc kubenswrapper[4675]: I0320 16:25:03.847657 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:03 crc kubenswrapper[4675]: I0320 16:25:03.888724 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-m255x"]
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.004598 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.005013 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.005120 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.005184 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.005207 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4vc\" (UniqueName: \"kubernetes.io/projected/e14e9637-edff-450c-ad95-6c0367ee120d-kube-api-access-6f4vc\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.005236 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-config\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.107072 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-config\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.107170 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.107196 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.107282 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.107331 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.107349 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4vc\" (UniqueName: \"kubernetes.io/projected/e14e9637-edff-450c-ad95-6c0367ee120d-kube-api-access-6f4vc\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.108498 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-config\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.109078 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.109676 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.110259 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.110892 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e14e9637-edff-450c-ad95-6c0367ee120d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.126357 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4vc\" (UniqueName: \"kubernetes.io/projected/e14e9637-edff-450c-ad95-6c0367ee120d-kube-api-access-6f4vc\") pod \"dnsmasq-dns-59cf4bdb65-m255x\" (UID: \"e14e9637-edff-450c-ad95-6c0367ee120d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.177211 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:04 crc kubenswrapper[4675]: I0320 16:25:04.682326 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-m255x"]
Mar 20 16:25:04 crc kubenswrapper[4675]: W0320 16:25:04.684618 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14e9637_edff_450c_ad95_6c0367ee120d.slice/crio-ab03815e3112538d2cd7c5ba2eb07ca3d95387e9a4d6583834e5fd9dac3966e8 WatchSource:0}: Error finding container ab03815e3112538d2cd7c5ba2eb07ca3d95387e9a4d6583834e5fd9dac3966e8: Status 404 returned error can't find the container with id ab03815e3112538d2cd7c5ba2eb07ca3d95387e9a4d6583834e5fd9dac3966e8
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.685222 4675 generic.go:334] "Generic (PLEG): container finished" podID="e14e9637-edff-450c-ad95-6c0367ee120d" containerID="e4dc210e3eeae473ec9996447a37b3a75782be058c5dcf98d9a547a4637c2832" exitCode=0
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.685272 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x" event={"ID":"e14e9637-edff-450c-ad95-6c0367ee120d","Type":"ContainerDied","Data":"e4dc210e3eeae473ec9996447a37b3a75782be058c5dcf98d9a547a4637c2832"}
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.685662 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x" event={"ID":"e14e9637-edff-450c-ad95-6c0367ee120d","Type":"ContainerStarted","Data":"ab03815e3112538d2cd7c5ba2eb07ca3d95387e9a4d6583834e5fd9dac3966e8"}
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.699734 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.701068 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-central-agent" containerID="cri-o://f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" gracePeriod=30
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.701226 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="sg-core" containerID="cri-o://a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" gracePeriod=30
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.701363 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-notification-agent" containerID="cri-o://47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" gracePeriod=30
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.701371 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="proxy-httpd" containerID="cri-o://d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" gracePeriod=30
Mar 20 16:25:05 crc kubenswrapper[4675]: I0320 16:25:05.803491 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": read tcp 10.217.0.2:35730->10.217.0.206:3000: read: connection reset by peer"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.246126 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.319490 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.530480 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655414 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-config-data\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655611 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-combined-ca-bundle\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655651 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-scripts\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655710 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-ceilometer-tls-certs\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655830 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-sg-core-conf-yaml\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655903 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhwpk\" (UniqueName: \"kubernetes.io/projected/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-kube-api-access-mhwpk\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655929 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-run-httpd\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.655952 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-log-httpd\") pod \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\" (UID: \"0cb7e9a2-004e-4cc8-962a-939b2fb41f75\") "
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.656862 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.657242 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.657648 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.662561 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-kube-api-access-mhwpk" (OuterVolumeSpecName: "kube-api-access-mhwpk") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "kube-api-access-mhwpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.668252 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-scripts" (OuterVolumeSpecName: "scripts") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.704173 4675 generic.go:334] "Generic (PLEG): container finished" podID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" exitCode=0
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.704221 4675 generic.go:334] "Generic (PLEG): container finished" podID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" exitCode=2
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.704246 4675 generic.go:334] "Generic (PLEG): container finished" podID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" exitCode=0
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.704259 4675 generic.go:334] "Generic (PLEG): container finished" podID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" exitCode=0
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.704490 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.707665 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-log" containerID="cri-o://a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29" gracePeriod=30
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.711935 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-api" containerID="cri-o://a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175" gracePeriod=30
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.744928 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.754992 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.758536 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.758571 4675 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.758583 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.758592 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhwpk\" (UniqueName: \"kubernetes.io/projected/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-kube-api-access-mhwpk\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.758600 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.767144 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773678 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773711 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerDied","Data":"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"}
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773737 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerDied","Data":"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"}
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773749 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerDied","Data":"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"}
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773757 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerDied","Data":"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"}
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773781 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb7e9a2-004e-4cc8-962a-939b2fb41f75","Type":"ContainerDied","Data":"fcef7eae6aa81059b21bf54f9f0736749469eaa599375cc83d69fc0fa240bc7b"}
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773790 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x" event={"ID":"e14e9637-edff-450c-ad95-6c0367ee120d","Type":"ContainerStarted","Data":"f40153d3274269564ffb1466cbffb00c536ab3380f6b99b4ee1b7048c02836a9"}
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.773807 4675 scope.go:117] "RemoveContainer" containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.792728 4675 scope.go:117] "RemoveContainer" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.813910 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-config-data" (OuterVolumeSpecName: "config-data") pod "0cb7e9a2-004e-4cc8-962a-939b2fb41f75" (UID: "0cb7e9a2-004e-4cc8-962a-939b2fb41f75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.820242 4675 scope.go:117] "RemoveContainer" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.846684 4675 scope.go:117] "RemoveContainer" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.860737 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.860800 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb7e9a2-004e-4cc8-962a-939b2fb41f75-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.869724 4675 scope.go:117] "RemoveContainer"
containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" Mar 20 16:25:06 crc kubenswrapper[4675]: E0320 16:25:06.870266 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": container with ID starting with d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2 not found: ID does not exist" containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.870305 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"} err="failed to get container status \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": rpc error: code = NotFound desc = could not find container \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": container with ID starting with d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.870332 4675 scope.go:117] "RemoveContainer" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" Mar 20 16:25:06 crc kubenswrapper[4675]: E0320 16:25:06.870749 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": container with ID starting with a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36 not found: ID does not exist" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.870882 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"} err="failed to get container status \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": rpc error: code = NotFound desc = could not find container \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": container with ID starting with a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.870908 4675 scope.go:117] "RemoveContainer" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" Mar 20 16:25:06 crc kubenswrapper[4675]: E0320 16:25:06.871298 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": container with ID starting with 47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4 not found: ID does not exist" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.871320 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"} err="failed to get container status \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": rpc error: code = NotFound desc = could not find container \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": container with ID starting with 47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.871340 4675 scope.go:117] "RemoveContainer" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" Mar 20 16:25:06 crc kubenswrapper[4675]: E0320 16:25:06.871558 4675 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": container with ID starting with f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18 not found: ID does not exist" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.871580 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"} err="failed to get container status \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": rpc error: code = NotFound desc = could not find container \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": container with ID starting with f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.871593 4675 scope.go:117] "RemoveContainer" containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.871835 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"} err="failed to get container status \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": rpc error: code = NotFound desc = could not find container \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": container with ID starting with d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.871862 4675 scope.go:117] "RemoveContainer" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.872054 4675 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"} err="failed to get container status \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": rpc error: code = NotFound desc = could not find container \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": container with ID starting with a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.872073 4675 scope.go:117] "RemoveContainer" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.872303 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"} err="failed to get container status \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": rpc error: code = NotFound desc = could not find container \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": container with ID starting with 47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.872321 4675 scope.go:117] "RemoveContainer" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.872510 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"} err="failed to get container status \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": rpc error: code = NotFound desc = could not find container \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": container with ID starting with 
f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.872540 4675 scope.go:117] "RemoveContainer" containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.873431 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"} err="failed to get container status \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": rpc error: code = NotFound desc = could not find container \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": container with ID starting with d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.873463 4675 scope.go:117] "RemoveContainer" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.873717 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"} err="failed to get container status \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": rpc error: code = NotFound desc = could not find container \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": container with ID starting with a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.873741 4675 scope.go:117] "RemoveContainer" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.874005 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"} err="failed to get container status \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": rpc error: code = NotFound desc = could not find container \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": container with ID starting with 47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.874029 4675 scope.go:117] "RemoveContainer" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.874278 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"} err="failed to get container status \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": rpc error: code = NotFound desc = could not find container \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": container with ID starting with f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.874303 4675 scope.go:117] "RemoveContainer" containerID="d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.874617 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2"} err="failed to get container status \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": rpc error: code = NotFound desc = could not find container \"d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2\": container with ID starting with d7cb81e40e0090ca5b23924525978e643727960f0ba7a1bcf7890bf9d175b7f2 not found: ID does not 
exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.874644 4675 scope.go:117] "RemoveContainer" containerID="a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.875241 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36"} err="failed to get container status \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": rpc error: code = NotFound desc = could not find container \"a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36\": container with ID starting with a7a31e562c40799cc69443045bd6cf9f987fb8c45b877ca62e1df7e470bcbb36 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.875355 4675 scope.go:117] "RemoveContainer" containerID="47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.875652 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4"} err="failed to get container status \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": rpc error: code = NotFound desc = could not find container \"47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4\": container with ID starting with 47abc4710b0daa7551a27490277ade9319e60e4710f991e21bb8bbd8d141ede4 not found: ID does not exist" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.875677 4675 scope.go:117] "RemoveContainer" containerID="f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18" Mar 20 16:25:06 crc kubenswrapper[4675]: I0320 16:25:06.877007 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18"} err="failed to get container status 
\"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": rpc error: code = NotFound desc = could not find container \"f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18\": container with ID starting with f569ee7a97e2f6581a925ba64f8de1409ec4ab4618de1219d8bbee2e875afc18 not found: ID does not exist" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.047846 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x" podStartSLOduration=4.047820684 podStartE2EDuration="4.047820684s" podCreationTimestamp="2026-03-20 16:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:06.752440368 +0000 UTC m=+1426.786069905" watchObservedRunningTime="2026-03-20 16:25:07.047820684 +0000 UTC m=+1427.081450221" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.057853 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.079205 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.100987 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:07 crc kubenswrapper[4675]: E0320 16:25:07.101366 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="sg-core" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101383 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="sg-core" Mar 20 16:25:07 crc kubenswrapper[4675]: E0320 16:25:07.101408 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-notification-agent" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101415 
4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-notification-agent" Mar 20 16:25:07 crc kubenswrapper[4675]: E0320 16:25:07.101432 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-central-agent" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101439 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-central-agent" Mar 20 16:25:07 crc kubenswrapper[4675]: E0320 16:25:07.101449 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="proxy-httpd" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101455 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="proxy-httpd" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101613 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="sg-core" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101633 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="proxy-httpd" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101650 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-notification-agent" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.101661 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" containerName="ceilometer-central-agent" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.103272 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.112861 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.116393 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.116471 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.117300 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.267359 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-scripts\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.267657 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-log-httpd\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.267826 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.267934 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-config-data\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.268066 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-run-httpd\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.268188 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.268318 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.268468 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q96fj\" (UniqueName: \"kubernetes.io/projected/8843c33f-e611-429b-969b-cdb8883c74bc-kube-api-access-q96fj\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.370252 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-scripts\") pod \"ceilometer-0\" (UID: 
\"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.370367 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-log-httpd\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.370476 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.370504 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-config-data\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.370524 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-run-httpd\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.370971 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-log-httpd\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.371040 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.371125 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-run-httpd\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.371183 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.371645 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q96fj\" (UniqueName: \"kubernetes.io/projected/8843c33f-e611-429b-969b-cdb8883c74bc-kube-api-access-q96fj\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.375837 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.375879 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-scripts\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 
16:25:07.377358 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-config-data\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.378101 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.379144 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.388730 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q96fj\" (UniqueName: \"kubernetes.io/projected/8843c33f-e611-429b-969b-cdb8883c74bc-kube-api-access-q96fj\") pod \"ceilometer-0\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " pod="openstack/ceilometer-0" Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.449494 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.662090 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.718689 4675 generic.go:334] "Generic (PLEG): container finished" podID="070526b2-1aa9-4114-bf6a-327317631cf5" containerID="a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29" exitCode=143
Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.718745 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"070526b2-1aa9-4114-bf6a-327317631cf5","Type":"ContainerDied","Data":"a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29"}
Mar 20 16:25:07 crc kubenswrapper[4675]: I0320 16:25:07.943473 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:25:07 crc kubenswrapper[4675]: W0320 16:25:07.947642 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8843c33f_e611_429b_969b_cdb8883c74bc.slice/crio-0eca789f3c91a2e31c93d330d83d98ae39b4ae13c44a5b15afaa09a5fa1dcdfc WatchSource:0}: Error finding container 0eca789f3c91a2e31c93d330d83d98ae39b4ae13c44a5b15afaa09a5fa1dcdfc: Status 404 returned error can't find the container with id 0eca789f3c91a2e31c93d330d83d98ae39b4ae13c44a5b15afaa09a5fa1dcdfc
Mar 20 16:25:08 crc kubenswrapper[4675]: I0320 16:25:08.683675 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb7e9a2-004e-4cc8-962a-939b2fb41f75" path="/var/lib/kubelet/pods/0cb7e9a2-004e-4cc8-962a-939b2fb41f75/volumes"
Mar 20 16:25:08 crc kubenswrapper[4675]: I0320 16:25:08.737137 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerStarted","Data":"ada7ba2162a6e798a495f93e6fe37af99800ac108c88fcde02f956f56c55f91b"}
Mar 20 16:25:08 crc kubenswrapper[4675]: I0320 16:25:08.737194 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerStarted","Data":"0eca789f3c91a2e31c93d330d83d98ae39b4ae13c44a5b15afaa09a5fa1dcdfc"}
Mar 20 16:25:09 crc kubenswrapper[4675]: I0320 16:25:09.751734 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerStarted","Data":"c2f1468059e8174d62f1387a368b5742bd91d55c9e02756c2b1915074ea029d8"}
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.248125 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.332446 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfnpk\" (UniqueName: \"kubernetes.io/projected/070526b2-1aa9-4114-bf6a-327317631cf5-kube-api-access-cfnpk\") pod \"070526b2-1aa9-4114-bf6a-327317631cf5\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") "
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.332671 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-combined-ca-bundle\") pod \"070526b2-1aa9-4114-bf6a-327317631cf5\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") "
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.332723 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-config-data\") pod \"070526b2-1aa9-4114-bf6a-327317631cf5\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") "
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.332827 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070526b2-1aa9-4114-bf6a-327317631cf5-logs\") pod \"070526b2-1aa9-4114-bf6a-327317631cf5\" (UID: \"070526b2-1aa9-4114-bf6a-327317631cf5\") "
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.333921 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070526b2-1aa9-4114-bf6a-327317631cf5-logs" (OuterVolumeSpecName: "logs") pod "070526b2-1aa9-4114-bf6a-327317631cf5" (UID: "070526b2-1aa9-4114-bf6a-327317631cf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.340972 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/070526b2-1aa9-4114-bf6a-327317631cf5-kube-api-access-cfnpk" (OuterVolumeSpecName: "kube-api-access-cfnpk") pod "070526b2-1aa9-4114-bf6a-327317631cf5" (UID: "070526b2-1aa9-4114-bf6a-327317631cf5"). InnerVolumeSpecName "kube-api-access-cfnpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.390331 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-config-data" (OuterVolumeSpecName: "config-data") pod "070526b2-1aa9-4114-bf6a-327317631cf5" (UID: "070526b2-1aa9-4114-bf6a-327317631cf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.400095 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "070526b2-1aa9-4114-bf6a-327317631cf5" (UID: "070526b2-1aa9-4114-bf6a-327317631cf5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.435645 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.435688 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070526b2-1aa9-4114-bf6a-327317631cf5-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.435700 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/070526b2-1aa9-4114-bf6a-327317631cf5-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.435712 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfnpk\" (UniqueName: \"kubernetes.io/projected/070526b2-1aa9-4114-bf6a-327317631cf5-kube-api-access-cfnpk\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.763736 4675 generic.go:334] "Generic (PLEG): container finished" podID="070526b2-1aa9-4114-bf6a-327317631cf5" containerID="a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175" exitCode=0
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.763815 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"070526b2-1aa9-4114-bf6a-327317631cf5","Type":"ContainerDied","Data":"a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175"}
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.763832 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.763896 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"070526b2-1aa9-4114-bf6a-327317631cf5","Type":"ContainerDied","Data":"c22843dd84fc8aaf7fcd68d4ada1f69fce82de463aea0c268983f26c778afd52"}
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.763927 4675 scope.go:117] "RemoveContainer" containerID="a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.768666 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerStarted","Data":"164a1ac489e08c78060220608e53c3adc593ec270e34d9810845727dbf5269e0"}
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.792751 4675 scope.go:117] "RemoveContainer" containerID="a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.794828 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.804642 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.828218 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:10 crc kubenswrapper[4675]: E0320 16:25:10.828681 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-api"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.828697 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-api"
Mar 20 16:25:10 crc kubenswrapper[4675]: E0320 16:25:10.828725 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-log"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.828734 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-log"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.828946 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-api"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.828966 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" containerName="nova-api-log"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.830594 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.833594 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.833666 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.834420 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.838329 4675 scope.go:117] "RemoveContainer" containerID="a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175"
Mar 20 16:25:10 crc kubenswrapper[4675]: E0320 16:25:10.840028 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175\": container with ID starting with a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175 not found: ID does not exist" containerID="a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.840059 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175"} err="failed to get container status \"a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175\": rpc error: code = NotFound desc = could not find container \"a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175\": container with ID starting with a4d54acecc62482276579bbe1ce2352417111f42003c73732abf39e328970175 not found: ID does not exist"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.840079 4675 scope.go:117] "RemoveContainer" containerID="a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29"
Mar 20 16:25:10 crc kubenswrapper[4675]: E0320 16:25:10.840615 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29\": container with ID starting with a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29 not found: ID does not exist" containerID="a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.840669 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29"} err="failed to get container status \"a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29\": rpc error: code = NotFound desc = could not find container \"a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29\": container with ID starting with a5b6b0928dfc7f92f022c271e08fdbda752d7277b6701eeb8f9cce32db4b0c29 not found: ID does not exist"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.860412 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.943573 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88c5d32-527c-494a-8721-50fe2b3c717b-logs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.943619 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-public-tls-certs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.943709 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wjkc\" (UniqueName: \"kubernetes.io/projected/c88c5d32-527c-494a-8721-50fe2b3c717b-kube-api-access-9wjkc\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.943743 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.943757 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-config-data\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:10 crc kubenswrapper[4675]: I0320 16:25:10.943841 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.045396 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.045883 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88c5d32-527c-494a-8721-50fe2b3c717b-logs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.045986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-public-tls-certs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.046091 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wjkc\" (UniqueName: \"kubernetes.io/projected/c88c5d32-527c-494a-8721-50fe2b3c717b-kube-api-access-9wjkc\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.046185 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.046248 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-config-data\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.046454 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88c5d32-527c-494a-8721-50fe2b3c717b-logs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.050639 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.050646 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-public-tls-certs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.050943 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.059427 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-config-data\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.073415 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wjkc\" (UniqueName: \"kubernetes.io/projected/c88c5d32-527c-494a-8721-50fe2b3c717b-kube-api-access-9wjkc\") pod \"nova-api-0\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") " pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.149956 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.319920 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.353171 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.637503 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.787353 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c88c5d32-527c-494a-8721-50fe2b3c717b","Type":"ContainerStarted","Data":"accdaa72dc1f6939ddd2e113c00e1ed8189b48d4d2b2ef1a08fe1799715c3dcf"}
Mar 20 16:25:11 crc kubenswrapper[4675]: I0320 16:25:11.805844 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.014724 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-t9dln"]
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.017011 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.019382 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.020910 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.054231 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t9dln"]
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.174854 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szg98\" (UniqueName: \"kubernetes.io/projected/8094317b-8891-4a65-ac16-3211f0209ba4-kube-api-access-szg98\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.175020 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-config-data\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.175059 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-scripts\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.175082 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.277056 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szg98\" (UniqueName: \"kubernetes.io/projected/8094317b-8891-4a65-ac16-3211f0209ba4-kube-api-access-szg98\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.277184 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-config-data\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.277205 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-scripts\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.277222 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.281847 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-scripts\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.281915 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-config-data\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.282592 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.298778 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szg98\" (UniqueName: \"kubernetes.io/projected/8094317b-8891-4a65-ac16-3211f0209ba4-kube-api-access-szg98\") pod \"nova-cell1-cell-mapping-t9dln\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") " pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.346067 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.691960 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="070526b2-1aa9-4114-bf6a-327317631cf5" path="/var/lib/kubelet/pods/070526b2-1aa9-4114-bf6a-327317631cf5/volumes"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.796543 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-central-agent" containerID="cri-o://ada7ba2162a6e798a495f93e6fe37af99800ac108c88fcde02f956f56c55f91b" gracePeriod=30
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.796843 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerStarted","Data":"0c8ae4b715416c15fca67bb4b80c2373fed733c878ef6efd9de39e94eee7e17f"}
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.796895 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.797210 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="proxy-httpd" containerID="cri-o://0c8ae4b715416c15fca67bb4b80c2373fed733c878ef6efd9de39e94eee7e17f" gracePeriod=30
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.797258 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-notification-agent" containerID="cri-o://c2f1468059e8174d62f1387a368b5742bd91d55c9e02756c2b1915074ea029d8" gracePeriod=30
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.797276 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="sg-core" containerID="cri-o://164a1ac489e08c78060220608e53c3adc593ec270e34d9810845727dbf5269e0" gracePeriod=30
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.801430 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c88c5d32-527c-494a-8721-50fe2b3c717b","Type":"ContainerStarted","Data":"8dbbf01a8280d4cd9cf196f0776db7a4b4b6c03f73670cf8815315d9d3976046"}
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.801470 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c88c5d32-527c-494a-8721-50fe2b3c717b","Type":"ContainerStarted","Data":"9b22e460dd124e7e7f63a57ae8be4909fd8c7e3efbbac998a8c0360690e88f83"}
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.833732 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.161329909 podStartE2EDuration="5.833715207s" podCreationTimestamp="2026-03-20 16:25:07 +0000 UTC" firstStartedPulling="2026-03-20 16:25:07.950188928 +0000 UTC m=+1427.983818465" lastFinishedPulling="2026-03-20 16:25:11.622574226 +0000 UTC m=+1431.656203763" observedRunningTime="2026-03-20 16:25:12.826949356 +0000 UTC m=+1432.860578893" watchObservedRunningTime="2026-03-20 16:25:12.833715207 +0000 UTC m=+1432.867344744"
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.850273 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t9dln"]
Mar 20 16:25:12 crc kubenswrapper[4675]: I0320 16:25:12.870692 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.870675514 podStartE2EDuration="2.870675514s" podCreationTimestamp="2026-03-20 16:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:12.85429101 +0000 UTC m=+1432.887920547" watchObservedRunningTime="2026-03-20 16:25:12.870675514 +0000 UTC m=+1432.904305051"
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.809717 4675 generic.go:334] "Generic (PLEG): container finished" podID="8843c33f-e611-429b-969b-cdb8883c74bc" containerID="0c8ae4b715416c15fca67bb4b80c2373fed733c878ef6efd9de39e94eee7e17f" exitCode=0
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.810599 4675 generic.go:334] "Generic (PLEG): container finished" podID="8843c33f-e611-429b-969b-cdb8883c74bc" containerID="164a1ac489e08c78060220608e53c3adc593ec270e34d9810845727dbf5269e0" exitCode=2
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.810661 4675 generic.go:334] "Generic (PLEG): container finished" podID="8843c33f-e611-429b-969b-cdb8883c74bc" containerID="c2f1468059e8174d62f1387a368b5742bd91d55c9e02756c2b1915074ea029d8" exitCode=0
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.809799 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerDied","Data":"0c8ae4b715416c15fca67bb4b80c2373fed733c878ef6efd9de39e94eee7e17f"}
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.810883 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerDied","Data":"164a1ac489e08c78060220608e53c3adc593ec270e34d9810845727dbf5269e0"}
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.810971 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerDied","Data":"c2f1468059e8174d62f1387a368b5742bd91d55c9e02756c2b1915074ea029d8"}
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.812095 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t9dln" event={"ID":"8094317b-8891-4a65-ac16-3211f0209ba4","Type":"ContainerStarted","Data":"e627d3a3ea3368a314537f889754a1f200c2e501cdb41fc790010e7c2ae8f8d3"}
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.812124 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t9dln" event={"ID":"8094317b-8891-4a65-ac16-3211f0209ba4","Type":"ContainerStarted","Data":"6e60e5280bab9bf0e4574b0ca9038118b5251e4a6353b7a138e2e9f85ec7deb8"}
Mar 20 16:25:13 crc kubenswrapper[4675]: I0320 16:25:13.829289 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-t9dln" podStartSLOduration=2.829267073 podStartE2EDuration="2.829267073s" podCreationTimestamp="2026-03-20 16:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:13.824706914 +0000 UTC m=+1433.858336451" watchObservedRunningTime="2026-03-20 16:25:13.829267073 +0000 UTC m=+1433.862896610"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.178966 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-m255x"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.247356 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4ntqw"]
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.247843 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerName="dnsmasq-dns" containerID="cri-o://14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9" gracePeriod=10
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.803193 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.830897 4675 generic.go:334] "Generic (PLEG): container finished" podID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerID="14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9" exitCode=0
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.830979 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.831001 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" event={"ID":"bb0ed576-df0b-45b0-ac22-4848f08bcbfb","Type":"ContainerDied","Data":"14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9"}
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.831079 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4ntqw" event={"ID":"bb0ed576-df0b-45b0-ac22-4848f08bcbfb","Type":"ContainerDied","Data":"48f7e89e1e3053a0954d0cd54cca116e177bc34a14ce0e15f9b1d90736ddc964"}
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.831129 4675 scope.go:117] "RemoveContainer" containerID="14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.857481 4675 scope.go:117] "RemoveContainer" containerID="19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.881706 4675 scope.go:117] "RemoveContainer" containerID="14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9"
Mar 20 16:25:14 crc kubenswrapper[4675]: E0320 16:25:14.883142 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9\": container with ID starting with 14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9 not found: ID does not exist" containerID="14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.883170 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9"} err="failed to get container status \"14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9\": rpc error: code = NotFound desc = could not find container \"14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9\": container with ID starting with 14fdb0a1c697f0fb9e164ad0a8892617df58ad1c47b0931de02e387678daf0b9 not found: ID does not exist"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.883192 4675 scope.go:117] "RemoveContainer" containerID="19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637"
Mar 20 16:25:14 crc kubenswrapper[4675]: E0320 16:25:14.884130 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637\": container with ID starting with 19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637 not found: ID does not exist" containerID="19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637"
Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.884150 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637"} err="failed to get container status \"19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637\": rpc error: code = NotFound desc = could not find container \"19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637\": container with ID starting with 19f65cb8d1c5657218392515dccbb50dbd8cb9eee1af755e9681a40e7b05b637 not found: ID does not
exist" Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.948954 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-nb\") pod \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.949012 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-svc\") pod \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.949099 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-sb\") pod \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.949175 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-config\") pod \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.949193 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-swift-storage-0\") pod \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.949218 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x956k\" (UniqueName: 
\"kubernetes.io/projected/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-kube-api-access-x956k\") pod \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\" (UID: \"bb0ed576-df0b-45b0-ac22-4848f08bcbfb\") " Mar 20 16:25:14 crc kubenswrapper[4675]: I0320 16:25:14.962077 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-kube-api-access-x956k" (OuterVolumeSpecName: "kube-api-access-x956k") pod "bb0ed576-df0b-45b0-ac22-4848f08bcbfb" (UID: "bb0ed576-df0b-45b0-ac22-4848f08bcbfb"). InnerVolumeSpecName "kube-api-access-x956k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.010606 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb0ed576-df0b-45b0-ac22-4848f08bcbfb" (UID: "bb0ed576-df0b-45b0-ac22-4848f08bcbfb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.014442 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb0ed576-df0b-45b0-ac22-4848f08bcbfb" (UID: "bb0ed576-df0b-45b0-ac22-4848f08bcbfb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.015298 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb0ed576-df0b-45b0-ac22-4848f08bcbfb" (UID: "bb0ed576-df0b-45b0-ac22-4848f08bcbfb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.025831 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-config" (OuterVolumeSpecName: "config") pod "bb0ed576-df0b-45b0-ac22-4848f08bcbfb" (UID: "bb0ed576-df0b-45b0-ac22-4848f08bcbfb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.039142 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb0ed576-df0b-45b0-ac22-4848f08bcbfb" (UID: "bb0ed576-df0b-45b0-ac22-4848f08bcbfb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.051014 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.051050 4675 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.051065 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x956k\" (UniqueName: \"kubernetes.io/projected/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-kube-api-access-x956k\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.051078 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 
20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.051090 4675 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.051101 4675 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb0ed576-df0b-45b0-ac22-4848f08bcbfb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.172728 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4ntqw"] Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.182430 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4ntqw"] Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.847940 4675 generic.go:334] "Generic (PLEG): container finished" podID="8843c33f-e611-429b-969b-cdb8883c74bc" containerID="ada7ba2162a6e798a495f93e6fe37af99800ac108c88fcde02f956f56c55f91b" exitCode=0 Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.847981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerDied","Data":"ada7ba2162a6e798a495f93e6fe37af99800ac108c88fcde02f956f56c55f91b"} Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.848001 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8843c33f-e611-429b-969b-cdb8883c74bc","Type":"ContainerDied","Data":"0eca789f3c91a2e31c93d330d83d98ae39b4ae13c44a5b15afaa09a5fa1dcdfc"} Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.848012 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eca789f3c91a2e31c93d330d83d98ae39b4ae13c44a5b15afaa09a5fa1dcdfc" Mar 20 16:25:15 crc kubenswrapper[4675]: I0320 16:25:15.881897 4675 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069193 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-config-data\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069326 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-sg-core-conf-yaml\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069364 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-ceilometer-tls-certs\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069449 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-scripts\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069465 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-combined-ca-bundle\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069488 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-log-httpd\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069530 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q96fj\" (UniqueName: \"kubernetes.io/projected/8843c33f-e611-429b-969b-cdb8883c74bc-kube-api-access-q96fj\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.069549 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-run-httpd\") pod \"8843c33f-e611-429b-969b-cdb8883c74bc\" (UID: \"8843c33f-e611-429b-969b-cdb8883c74bc\") " Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.070062 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.070097 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.074491 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-scripts" (OuterVolumeSpecName: "scripts") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.074932 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8843c33f-e611-429b-969b-cdb8883c74bc-kube-api-access-q96fj" (OuterVolumeSpecName: "kube-api-access-q96fj") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "kube-api-access-q96fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.107392 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.156994 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.171971 4675 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.172248 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.172258 4675 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.172267 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q96fj\" (UniqueName: \"kubernetes.io/projected/8843c33f-e611-429b-969b-cdb8883c74bc-kube-api-access-q96fj\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.172276 4675 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8843c33f-e611-429b-969b-cdb8883c74bc-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.172284 4675 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.188397 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: 
"8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.225530 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-config-data" (OuterVolumeSpecName: "config-data") pod "8843c33f-e611-429b-969b-cdb8883c74bc" (UID: "8843c33f-e611-429b-969b-cdb8883c74bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.273967 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.274006 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8843c33f-e611-429b-969b-cdb8883c74bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.686120 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" path="/var/lib/kubelet/pods/bb0ed576-df0b-45b0-ac22-4848f08bcbfb/volumes" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.857970 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.890081 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.900894 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.913055 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:16 crc kubenswrapper[4675]: E0320 16:25:16.913522 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-central-agent" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.913542 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-central-agent" Mar 20 16:25:16 crc kubenswrapper[4675]: E0320 16:25:16.913559 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerName="init" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.913568 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerName="init" Mar 20 16:25:16 crc kubenswrapper[4675]: E0320 16:25:16.913607 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-notification-agent" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.913616 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-notification-agent" Mar 20 16:25:16 crc kubenswrapper[4675]: E0320 16:25:16.916390 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerName="dnsmasq-dns" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916416 4675 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerName="dnsmasq-dns" Mar 20 16:25:16 crc kubenswrapper[4675]: E0320 16:25:16.916432 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="sg-core" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916440 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="sg-core" Mar 20 16:25:16 crc kubenswrapper[4675]: E0320 16:25:16.916462 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="proxy-httpd" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916470 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="proxy-httpd" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916852 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-notification-agent" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916889 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="ceilometer-central-agent" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916904 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0ed576-df0b-45b0-ac22-4848f08bcbfb" containerName="dnsmasq-dns" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916921 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="sg-core" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.916937 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" containerName="proxy-httpd" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.919097 4675 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.925052 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.925395 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.925538 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:25:16 crc kubenswrapper[4675]: I0320 16:25:16.935099 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089066 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxx4r\" (UniqueName: \"kubernetes.io/projected/179982ec-516c-43ea-b479-9e9309759410-kube-api-access-vxx4r\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089158 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089187 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/179982ec-516c-43ea-b479-9e9309759410-run-httpd\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089246 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/179982ec-516c-43ea-b479-9e9309759410-log-httpd\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089286 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089310 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-scripts\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089328 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-config-data\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.089388 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0" Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190429 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190477 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-scripts\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190497 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-config-data\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190529 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190580 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxx4r\" (UniqueName: \"kubernetes.io/projected/179982ec-516c-43ea-b479-9e9309759410-kube-api-access-vxx4r\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190612 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190635 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/179982ec-516c-43ea-b479-9e9309759410-run-httpd\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.190684 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/179982ec-516c-43ea-b479-9e9309759410-log-httpd\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.191068 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/179982ec-516c-43ea-b479-9e9309759410-log-httpd\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.193065 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/179982ec-516c-43ea-b479-9e9309759410-run-httpd\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.196264 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.196940 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.202870 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-config-data\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.203130 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-scripts\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.203817 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/179982ec-516c-43ea-b479-9e9309759410-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.216786 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxx4r\" (UniqueName: \"kubernetes.io/projected/179982ec-516c-43ea-b479-9e9309759410-kube-api-access-vxx4r\") pod \"ceilometer-0\" (UID: \"179982ec-516c-43ea-b479-9e9309759410\") " pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.253298 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.747805 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:25:17 crc kubenswrapper[4675]: W0320 16:25:17.787324 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179982ec_516c_43ea_b479_9e9309759410.slice/crio-073133759d8cda73a407148900da19bff6d1b22bbab242a1cfcf776cc5d110d5 WatchSource:0}: Error finding container 073133759d8cda73a407148900da19bff6d1b22bbab242a1cfcf776cc5d110d5: Status 404 returned error can't find the container with id 073133759d8cda73a407148900da19bff6d1b22bbab242a1cfcf776cc5d110d5
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.869379 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"179982ec-516c-43ea-b479-9e9309759410","Type":"ContainerStarted","Data":"073133759d8cda73a407148900da19bff6d1b22bbab242a1cfcf776cc5d110d5"}
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.871536 4675 generic.go:334] "Generic (PLEG): container finished" podID="8094317b-8891-4a65-ac16-3211f0209ba4" containerID="e627d3a3ea3368a314537f889754a1f200c2e501cdb41fc790010e7c2ae8f8d3" exitCode=0
Mar 20 16:25:17 crc kubenswrapper[4675]: I0320 16:25:17.871624 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t9dln" event={"ID":"8094317b-8891-4a65-ac16-3211f0209ba4","Type":"ContainerDied","Data":"e627d3a3ea3368a314537f889754a1f200c2e501cdb41fc790010e7c2ae8f8d3"}
Mar 20 16:25:18 crc kubenswrapper[4675]: I0320 16:25:18.687261 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8843c33f-e611-429b-969b-cdb8883c74bc" path="/var/lib/kubelet/pods/8843c33f-e611-429b-969b-cdb8883c74bc/volumes"
Mar 20 16:25:18 crc kubenswrapper[4675]: I0320 16:25:18.882417 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"179982ec-516c-43ea-b479-9e9309759410","Type":"ContainerStarted","Data":"34481b0d952f0bcc258bf5e744d9b143577f5ed63d39aaf25bfe6117756ef0ff"}
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.276134 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.434825 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szg98\" (UniqueName: \"kubernetes.io/projected/8094317b-8891-4a65-ac16-3211f0209ba4-kube-api-access-szg98\") pod \"8094317b-8891-4a65-ac16-3211f0209ba4\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") "
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.435375 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-config-data\") pod \"8094317b-8891-4a65-ac16-3211f0209ba4\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") "
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.435982 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-combined-ca-bundle\") pod \"8094317b-8891-4a65-ac16-3211f0209ba4\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") "
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.436049 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-scripts\") pod \"8094317b-8891-4a65-ac16-3211f0209ba4\" (UID: \"8094317b-8891-4a65-ac16-3211f0209ba4\") "
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.441353 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8094317b-8891-4a65-ac16-3211f0209ba4-kube-api-access-szg98" (OuterVolumeSpecName: "kube-api-access-szg98") pod "8094317b-8891-4a65-ac16-3211f0209ba4" (UID: "8094317b-8891-4a65-ac16-3211f0209ba4"). InnerVolumeSpecName "kube-api-access-szg98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.442409 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-scripts" (OuterVolumeSpecName: "scripts") pod "8094317b-8891-4a65-ac16-3211f0209ba4" (UID: "8094317b-8891-4a65-ac16-3211f0209ba4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.462774 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8094317b-8891-4a65-ac16-3211f0209ba4" (UID: "8094317b-8891-4a65-ac16-3211f0209ba4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.467670 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-config-data" (OuterVolumeSpecName: "config-data") pod "8094317b-8891-4a65-ac16-3211f0209ba4" (UID: "8094317b-8891-4a65-ac16-3211f0209ba4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.537998 4675 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.538036 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szg98\" (UniqueName: \"kubernetes.io/projected/8094317b-8891-4a65-ac16-3211f0209ba4-kube-api-access-szg98\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.538051 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.538059 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8094317b-8891-4a65-ac16-3211f0209ba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.893654 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"179982ec-516c-43ea-b479-9e9309759410","Type":"ContainerStarted","Data":"b07c2fe39218d56989d97c55627d666a5654a53b9c7520dde6874aac56b7053b"}
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.893701 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"179982ec-516c-43ea-b479-9e9309759410","Type":"ContainerStarted","Data":"8bf196b950444d8922651eb9c3ef3a44bf9b38b758fafa8c5534a6a9c02df7d9"}
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.895615 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t9dln" event={"ID":"8094317b-8891-4a65-ac16-3211f0209ba4","Type":"ContainerDied","Data":"6e60e5280bab9bf0e4574b0ca9038118b5251e4a6353b7a138e2e9f85ec7deb8"}
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.895644 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e60e5280bab9bf0e4574b0ca9038118b5251e4a6353b7a138e2e9f85ec7deb8"
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.895708 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t9dln"
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.990815 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:25:19 crc kubenswrapper[4675]: I0320 16:25:19.991135 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fcdab570-2836-4f45-98c8-974aab349015" containerName="nova-scheduler-scheduler" containerID="cri-o://24ded4262b64a4cb4e39d8797393d19c127c96aab7648815732be701ac90ba0c" gracePeriod=30
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.006225 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.006643 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-log" containerID="cri-o://9b22e460dd124e7e7f63a57ae8be4909fd8c7e3efbbac998a8c0360690e88f83" gracePeriod=30
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.006793 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-api" containerID="cri-o://8dbbf01a8280d4cd9cf196f0776db7a4b4b6c03f73670cf8815315d9d3976046" gracePeriod=30
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.030359 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.030876 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-log" containerID="cri-o://0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0" gracePeriod=30
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.031022 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-metadata" containerID="cri-o://65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81" gracePeriod=30
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.909369 4675 generic.go:334] "Generic (PLEG): container finished" podID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerID="0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0" exitCode=143
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.909557 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee16f9b2-f9aa-42c4-8e19-29da55f3c861","Type":"ContainerDied","Data":"0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0"}
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.914416 4675 generic.go:334] "Generic (PLEG): container finished" podID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerID="8dbbf01a8280d4cd9cf196f0776db7a4b4b6c03f73670cf8815315d9d3976046" exitCode=0
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.914451 4675 generic.go:334] "Generic (PLEG): container finished" podID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerID="9b22e460dd124e7e7f63a57ae8be4909fd8c7e3efbbac998a8c0360690e88f83" exitCode=143
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.914474 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c88c5d32-527c-494a-8721-50fe2b3c717b","Type":"ContainerDied","Data":"8dbbf01a8280d4cd9cf196f0776db7a4b4b6c03f73670cf8815315d9d3976046"}
Mar 20 16:25:20 crc kubenswrapper[4675]: I0320 16:25:20.914500 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c88c5d32-527c-494a-8721-50fe2b3c717b","Type":"ContainerDied","Data":"9b22e460dd124e7e7f63a57ae8be4909fd8c7e3efbbac998a8c0360690e88f83"}
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.063306 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.168815 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88c5d32-527c-494a-8721-50fe2b3c717b-logs\") pod \"c88c5d32-527c-494a-8721-50fe2b3c717b\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") "
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.168887 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-config-data\") pod \"c88c5d32-527c-494a-8721-50fe2b3c717b\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") "
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.168946 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-internal-tls-certs\") pod \"c88c5d32-527c-494a-8721-50fe2b3c717b\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") "
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.168982 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-combined-ca-bundle\") pod \"c88c5d32-527c-494a-8721-50fe2b3c717b\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") "
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.169000 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wjkc\" (UniqueName: \"kubernetes.io/projected/c88c5d32-527c-494a-8721-50fe2b3c717b-kube-api-access-9wjkc\") pod \"c88c5d32-527c-494a-8721-50fe2b3c717b\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") "
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.169105 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-public-tls-certs\") pod \"c88c5d32-527c-494a-8721-50fe2b3c717b\" (UID: \"c88c5d32-527c-494a-8721-50fe2b3c717b\") "
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.170265 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c88c5d32-527c-494a-8721-50fe2b3c717b-logs" (OuterVolumeSpecName: "logs") pod "c88c5d32-527c-494a-8721-50fe2b3c717b" (UID: "c88c5d32-527c-494a-8721-50fe2b3c717b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.175176 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88c5d32-527c-494a-8721-50fe2b3c717b-kube-api-access-9wjkc" (OuterVolumeSpecName: "kube-api-access-9wjkc") pod "c88c5d32-527c-494a-8721-50fe2b3c717b" (UID: "c88c5d32-527c-494a-8721-50fe2b3c717b"). InnerVolumeSpecName "kube-api-access-9wjkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.199856 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-config-data" (OuterVolumeSpecName: "config-data") pod "c88c5d32-527c-494a-8721-50fe2b3c717b" (UID: "c88c5d32-527c-494a-8721-50fe2b3c717b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.228512 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c88c5d32-527c-494a-8721-50fe2b3c717b" (UID: "c88c5d32-527c-494a-8721-50fe2b3c717b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.248443 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c88c5d32-527c-494a-8721-50fe2b3c717b" (UID: "c88c5d32-527c-494a-8721-50fe2b3c717b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.257285 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c88c5d32-527c-494a-8721-50fe2b3c717b" (UID: "c88c5d32-527c-494a-8721-50fe2b3c717b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.271514 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.271548 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wjkc\" (UniqueName: \"kubernetes.io/projected/c88c5d32-527c-494a-8721-50fe2b3c717b-kube-api-access-9wjkc\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.271562 4675 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.271570 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c88c5d32-527c-494a-8721-50fe2b3c717b-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.271580 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.271589 4675 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c88c5d32-527c-494a-8721-50fe2b3c717b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.934234 4675 generic.go:334] "Generic (PLEG): container finished" podID="fcdab570-2836-4f45-98c8-974aab349015" containerID="24ded4262b64a4cb4e39d8797393d19c127c96aab7648815732be701ac90ba0c" exitCode=0
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.934858 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdab570-2836-4f45-98c8-974aab349015","Type":"ContainerDied","Data":"24ded4262b64a4cb4e39d8797393d19c127c96aab7648815732be701ac90ba0c"}
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.938438 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"179982ec-516c-43ea-b479-9e9309759410","Type":"ContainerStarted","Data":"04d716b64bdf6e87bbd710d37b1e05e7f81e8122e9e0449d5d03aa05b9d481d3"}
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.939407 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.942151 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c88c5d32-527c-494a-8721-50fe2b3c717b","Type":"ContainerDied","Data":"accdaa72dc1f6939ddd2e113c00e1ed8189b48d4d2b2ef1a08fe1799715c3dcf"}
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.942230 4675 scope.go:117] "RemoveContainer" containerID="8dbbf01a8280d4cd9cf196f0776db7a4b4b6c03f73670cf8815315d9d3976046"
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.942246 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.974544 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.583012871 podStartE2EDuration="5.974519906s" podCreationTimestamp="2026-03-20 16:25:16 +0000 UTC" firstStartedPulling="2026-03-20 16:25:17.790166155 +0000 UTC m=+1437.823795692" lastFinishedPulling="2026-03-20 16:25:21.18167319 +0000 UTC m=+1441.215302727" observedRunningTime="2026-03-20 16:25:21.965086099 +0000 UTC m=+1441.998715686" watchObservedRunningTime="2026-03-20 16:25:21.974519906 +0000 UTC m=+1442.008149453"
Mar 20 16:25:21 crc kubenswrapper[4675]: I0320 16:25:21.977276 4675 scope.go:117] "RemoveContainer" containerID="9b22e460dd124e7e7f63a57ae8be4909fd8c7e3efbbac998a8c0360690e88f83"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.036115 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.047447 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081148 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:22 crc kubenswrapper[4675]: E0320 16:25:22.081504 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-log"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081523 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-log"
Mar 20 16:25:22 crc kubenswrapper[4675]: E0320 16:25:22.081556 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-api"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081564 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-api"
Mar 20 16:25:22 crc kubenswrapper[4675]: E0320 16:25:22.081582 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8094317b-8891-4a65-ac16-3211f0209ba4" containerName="nova-manage"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081587 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="8094317b-8891-4a65-ac16-3211f0209ba4" containerName="nova-manage"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081779 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-api"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081798 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" containerName="nova-api-log"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.081819 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="8094317b-8891-4a65-ac16-3211f0209ba4" containerName="nova-manage"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.082682 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.085063 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.085063 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.085289 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.108551 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.188151 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-kube-api-access-2hm4l\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.188246 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-public-tls-certs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.188287 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.188328 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-config-data\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.188394 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.188440 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-logs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.290440 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-public-tls-certs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.290558 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.290628 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-config-data\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.290677 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.290738 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-logs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.290820 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-kube-api-access-2hm4l\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.294792 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-logs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.303636 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.303910 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-public-tls-certs\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.304571 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-config-data\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.312524 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.317188 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43-kube-api-access-2hm4l\") pod \"nova-api-0\" (UID: \"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43\") " pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.367499 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.421358 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.500445 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-combined-ca-bundle\") pod \"fcdab570-2836-4f45-98c8-974aab349015\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") "
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.500879 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mnq2\" (UniqueName: \"kubernetes.io/projected/fcdab570-2836-4f45-98c8-974aab349015-kube-api-access-6mnq2\") pod \"fcdab570-2836-4f45-98c8-974aab349015\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") "
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.500930 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-config-data\") pod \"fcdab570-2836-4f45-98c8-974aab349015\" (UID: \"fcdab570-2836-4f45-98c8-974aab349015\") "
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.507539 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcdab570-2836-4f45-98c8-974aab349015-kube-api-access-6mnq2" (OuterVolumeSpecName: "kube-api-access-6mnq2") pod "fcdab570-2836-4f45-98c8-974aab349015" (UID: "fcdab570-2836-4f45-98c8-974aab349015"). InnerVolumeSpecName "kube-api-access-6mnq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.546888 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcdab570-2836-4f45-98c8-974aab349015" (UID: "fcdab570-2836-4f45-98c8-974aab349015"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.549671 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-config-data" (OuterVolumeSpecName: "config-data") pod "fcdab570-2836-4f45-98c8-974aab349015" (UID: "fcdab570-2836-4f45-98c8-974aab349015"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.602653 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.602679 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mnq2\" (UniqueName: \"kubernetes.io/projected/fcdab570-2836-4f45-98c8-974aab349015-kube-api-access-6mnq2\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.602691 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdab570-2836-4f45-98c8-974aab349015-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.687109 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88c5d32-527c-494a-8721-50fe2b3c717b" path="/var/lib/kubelet/pods/c88c5d32-527c-494a-8721-50fe2b3c717b/volumes"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.900484 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.954902 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43","Type":"ContainerStarted","Data":"66c09858a241f49bc71d22487cadd415d0472e0697a17c103911fa5d20a17158"}
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.959400 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdab570-2836-4f45-98c8-974aab349015","Type":"ContainerDied","Data":"45bfa8e10ebf5064e93ad10ab407fd16672aade7c45816790e768e555aab85c8"}
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.959447 4675 scope.go:117] "RemoveContainer" containerID="24ded4262b64a4cb4e39d8797393d19c127c96aab7648815732be701ac90ba0c"
Mar 20 16:25:22 crc kubenswrapper[4675]: I0320 16:25:22.959575 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.077933 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.090550 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.099392 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:25:23 crc kubenswrapper[4675]: E0320 16:25:23.099839 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdab570-2836-4f45-98c8-974aab349015" containerName="nova-scheduler-scheduler"
Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.099854 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdab570-2836-4f45-98c8-974aab349015" containerName="nova-scheduler-scheduler"
Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.100066 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcdab570-2836-4f45-98c8-974aab349015" containerName="nova-scheduler-scheduler"
Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.100643 4675 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.104039 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.112971 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.213001 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.213124 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-config-data\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.213215 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wh5x\" (UniqueName: \"kubernetes.io/projected/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-kube-api-access-6wh5x\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.315105 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.315157 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-config-data\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.315207 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wh5x\" (UniqueName: \"kubernetes.io/projected/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-kube-api-access-6wh5x\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.320663 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.321025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-config-data\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.332791 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wh5x\" (UniqueName: \"kubernetes.io/projected/1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f-kube-api-access-6wh5x\") pod \"nova-scheduler-0\" (UID: \"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f\") " pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.426621 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.587122 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.723695 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-nova-metadata-tls-certs\") pod \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.724090 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-config-data\") pod \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.724118 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4m6w\" (UniqueName: \"kubernetes.io/projected/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-kube-api-access-l4m6w\") pod \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.724229 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-combined-ca-bundle\") pod \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\" (UID: \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.724273 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-logs\") pod \"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\" (UID: 
\"ee16f9b2-f9aa-42c4-8e19-29da55f3c861\") " Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.724901 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-logs" (OuterVolumeSpecName: "logs") pod "ee16f9b2-f9aa-42c4-8e19-29da55f3c861" (UID: "ee16f9b2-f9aa-42c4-8e19-29da55f3c861"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.728574 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-kube-api-access-l4m6w" (OuterVolumeSpecName: "kube-api-access-l4m6w") pod "ee16f9b2-f9aa-42c4-8e19-29da55f3c861" (UID: "ee16f9b2-f9aa-42c4-8e19-29da55f3c861"). InnerVolumeSpecName "kube-api-access-l4m6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.748591 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-config-data" (OuterVolumeSpecName: "config-data") pod "ee16f9b2-f9aa-42c4-8e19-29da55f3c861" (UID: "ee16f9b2-f9aa-42c4-8e19-29da55f3c861"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.802882 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee16f9b2-f9aa-42c4-8e19-29da55f3c861" (UID: "ee16f9b2-f9aa-42c4-8e19-29da55f3c861"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.811514 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ee16f9b2-f9aa-42c4-8e19-29da55f3c861" (UID: "ee16f9b2-f9aa-42c4-8e19-29da55f3c861"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.826332 4675 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.826365 4675 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.826376 4675 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.826388 4675 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.826396 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4m6w\" (UniqueName: \"kubernetes.io/projected/ee16f9b2-f9aa-42c4-8e19-29da55f3c861-kube-api-access-l4m6w\") on node \"crc\" DevicePath \"\"" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.906232 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.987484 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43","Type":"ContainerStarted","Data":"e2b3f02503c4d6820fd7ca28c8d06581dea4c35eac87ea9a9368cdef347a5cef"} Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.987539 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43","Type":"ContainerStarted","Data":"2f035d573b9488de6f2fed3c677f948bed4f2aa0d619c70683e9e4758891c962"} Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.993945 4675 generic.go:334] "Generic (PLEG): container finished" podID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerID="65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81" exitCode=0 Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.994000 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee16f9b2-f9aa-42c4-8e19-29da55f3c861","Type":"ContainerDied","Data":"65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81"} Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.994020 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee16f9b2-f9aa-42c4-8e19-29da55f3c861","Type":"ContainerDied","Data":"2e6b35eeccd82b4c61da48ec26acc8918a1f4ad00c36b799f8d5869233a05565"} Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.994036 4675 scope.go:117] "RemoveContainer" containerID="65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.994040 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:25:23 crc kubenswrapper[4675]: I0320 16:25:23.995755 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f","Type":"ContainerStarted","Data":"1bf90406a257e00e630bcf458beabfd4c2514b7d6e9558510df9d0fa25a7c86e"} Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.008355 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.008330878 podStartE2EDuration="2.008330878s" podCreationTimestamp="2026-03-20 16:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:24.003514271 +0000 UTC m=+1444.037143818" watchObservedRunningTime="2026-03-20 16:25:24.008330878 +0000 UTC m=+1444.041960415" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.086976 4675 scope.go:117] "RemoveContainer" containerID="0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.119526 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.119650 4675 scope.go:117] "RemoveContainer" containerID="65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81" Mar 20 16:25:24 crc kubenswrapper[4675]: E0320 16:25:24.120495 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81\": container with ID starting with 65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81 not found: ID does not exist" containerID="65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.120532 4675 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81"} err="failed to get container status \"65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81\": rpc error: code = NotFound desc = could not find container \"65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81\": container with ID starting with 65bb51f6d44a5848b78180bd994c52d916ba8f754968310aaf51f1a28cd47c81 not found: ID does not exist" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.120554 4675 scope.go:117] "RemoveContainer" containerID="0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0" Mar 20 16:25:24 crc kubenswrapper[4675]: E0320 16:25:24.120886 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0\": container with ID starting with 0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0 not found: ID does not exist" containerID="0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.120919 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0"} err="failed to get container status \"0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0\": rpc error: code = NotFound desc = could not find container \"0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0\": container with ID starting with 0af8b6c7bd4d647c31bafab0dd577055b17809b48abea2af74951eec26e2a8f0 not found: ID does not exist" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.131756 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.147385 4675 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 20 16:25:24 crc kubenswrapper[4675]: E0320 16:25:24.147859 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-log" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.147880 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-log" Mar 20 16:25:24 crc kubenswrapper[4675]: E0320 16:25:24.147916 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-metadata" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.147926 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-metadata" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.148158 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-metadata" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.148193 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" containerName="nova-metadata-log" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.149357 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.151591 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.151821 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.164008 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.240709 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmm45\" (UniqueName: \"kubernetes.io/projected/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-kube-api-access-nmm45\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.240802 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-config-data\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.240844 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-logs\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.240884 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.240922 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.342515 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmm45\" (UniqueName: \"kubernetes.io/projected/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-kube-api-access-nmm45\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.342575 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-config-data\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.342606 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-logs\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.342628 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.342655 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.343410 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-logs\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.347236 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.348435 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-config-data\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.356540 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.359846 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmm45\" (UniqueName: \"kubernetes.io/projected/b34a59d7-ed22-4f2c-8214-0c69b352bbb1-kube-api-access-nmm45\") pod \"nova-metadata-0\" 
(UID: \"b34a59d7-ed22-4f2c-8214-0c69b352bbb1\") " pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.469396 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.690629 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee16f9b2-f9aa-42c4-8e19-29da55f3c861" path="/var/lib/kubelet/pods/ee16f9b2-f9aa-42c4-8e19-29da55f3c861/volumes" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.691945 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcdab570-2836-4f45-98c8-974aab349015" path="/var/lib/kubelet/pods/fcdab570-2836-4f45-98c8-974aab349015/volumes" Mar 20 16:25:24 crc kubenswrapper[4675]: I0320 16:25:24.896686 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:25:24 crc kubenswrapper[4675]: W0320 16:25:24.897551 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34a59d7_ed22_4f2c_8214_0c69b352bbb1.slice/crio-35f1cd3f742bf306929717f42dc78a50145c3c74cbc38863c318a6c2d1f1013a WatchSource:0}: Error finding container 35f1cd3f742bf306929717f42dc78a50145c3c74cbc38863c318a6c2d1f1013a: Status 404 returned error can't find the container with id 35f1cd3f742bf306929717f42dc78a50145c3c74cbc38863c318a6c2d1f1013a Mar 20 16:25:25 crc kubenswrapper[4675]: I0320 16:25:25.007849 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f","Type":"ContainerStarted","Data":"99d266c8901bc40b0b4e01977710819ea87db3f716b6cde5ff4bfa1dd7f2356f"} Mar 20 16:25:25 crc kubenswrapper[4675]: I0320 16:25:25.009118 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b34a59d7-ed22-4f2c-8214-0c69b352bbb1","Type":"ContainerStarted","Data":"35f1cd3f742bf306929717f42dc78a50145c3c74cbc38863c318a6c2d1f1013a"} Mar 20 16:25:25 crc kubenswrapper[4675]: I0320 16:25:25.030559 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.030542169 podStartE2EDuration="2.030542169s" podCreationTimestamp="2026-03-20 16:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:25.027295358 +0000 UTC m=+1445.060924895" watchObservedRunningTime="2026-03-20 16:25:25.030542169 +0000 UTC m=+1445.064171706" Mar 20 16:25:26 crc kubenswrapper[4675]: I0320 16:25:26.022150 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b34a59d7-ed22-4f2c-8214-0c69b352bbb1","Type":"ContainerStarted","Data":"894806cb35a5560052016dd7be27e9b08f661ebe7b84fad8cfddc13758eb8af6"} Mar 20 16:25:26 crc kubenswrapper[4675]: I0320 16:25:26.022792 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b34a59d7-ed22-4f2c-8214-0c69b352bbb1","Type":"ContainerStarted","Data":"6e85589c886ff61c56b7db48f79a381939bdc005aa080dd74412bdc9f75a20d6"} Mar 20 16:25:26 crc kubenswrapper[4675]: I0320 16:25:26.053258 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.053241115 podStartE2EDuration="2.053241115s" podCreationTimestamp="2026-03-20 16:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:25:26.047439091 +0000 UTC m=+1446.081068658" watchObservedRunningTime="2026-03-20 16:25:26.053241115 +0000 UTC m=+1446.086870652" Mar 20 16:25:28 crc kubenswrapper[4675]: I0320 16:25:28.428107 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Mar 20 16:25:32 crc kubenswrapper[4675]: I0320 16:25:32.422631 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:25:32 crc kubenswrapper[4675]: I0320 16:25:32.424721 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:25:33 crc kubenswrapper[4675]: I0320 16:25:33.428434 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 16:25:33 crc kubenswrapper[4675]: I0320 16:25:33.436001 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:25:33 crc kubenswrapper[4675]: I0320 16:25:33.436001 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:25:33 crc kubenswrapper[4675]: I0320 16:25:33.463378 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 16:25:34 crc kubenswrapper[4675]: I0320 16:25:34.149670 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 16:25:34 crc kubenswrapper[4675]: I0320 16:25:34.470382 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:25:34 crc kubenswrapper[4675]: I0320 16:25:34.470426 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:25:35 crc kubenswrapper[4675]: 
I0320 16:25:35.507050 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b34a59d7-ed22-4f2c-8214-0c69b352bbb1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:25:35 crc kubenswrapper[4675]: I0320 16:25:35.507064 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b34a59d7-ed22-4f2c-8214-0c69b352bbb1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:25:40 crc kubenswrapper[4675]: I0320 16:25:40.422977 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 16:25:40 crc kubenswrapper[4675]: I0320 16:25:40.424725 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 16:25:42 crc kubenswrapper[4675]: I0320 16:25:42.434031 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 16:25:42 crc kubenswrapper[4675]: I0320 16:25:42.439508 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 16:25:42 crc kubenswrapper[4675]: I0320 16:25:42.446091 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 16:25:42 crc kubenswrapper[4675]: I0320 16:25:42.470496 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:25:42 crc kubenswrapper[4675]: I0320 16:25:42.470571 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:25:43 crc kubenswrapper[4675]: I0320 16:25:43.265836 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 20 16:25:44 crc kubenswrapper[4675]: I0320 16:25:44.474654 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:25:44 crc kubenswrapper[4675]: I0320 16:25:44.475404 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:25:44 crc kubenswrapper[4675]: I0320 16:25:44.483102 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:25:45 crc kubenswrapper[4675]: I0320 16:25:45.299302 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:25:47 crc kubenswrapper[4675]: I0320 16:25:47.269206 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.141237 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567066-2sgx8"] Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.143151 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.147064 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.147179 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.147271 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.154543 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-2sgx8"] Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.246218 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qvk\" (UniqueName: \"kubernetes.io/projected/68eaed96-a9c7-4876-9610-73f080a5372b-kube-api-access-l7qvk\") pod \"auto-csr-approver-29567066-2sgx8\" (UID: \"68eaed96-a9c7-4876-9610-73f080a5372b\") " pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.348189 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qvk\" (UniqueName: \"kubernetes.io/projected/68eaed96-a9c7-4876-9610-73f080a5372b-kube-api-access-l7qvk\") pod \"auto-csr-approver-29567066-2sgx8\" (UID: \"68eaed96-a9c7-4876-9610-73f080a5372b\") " pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.368707 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qvk\" (UniqueName: \"kubernetes.io/projected/68eaed96-a9c7-4876-9610-73f080a5372b-kube-api-access-l7qvk\") pod \"auto-csr-approver-29567066-2sgx8\" (UID: \"68eaed96-a9c7-4876-9610-73f080a5372b\") " 
pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.471254 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.935903 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-2sgx8"] Mar 20 16:26:00 crc kubenswrapper[4675]: I0320 16:26:00.947595 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:26:01 crc kubenswrapper[4675]: I0320 16:26:01.432819 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" event={"ID":"68eaed96-a9c7-4876-9610-73f080a5372b","Type":"ContainerStarted","Data":"c741b49c658673db743d30b6a0b53563065cdfbf37ec5557578c5a8f60a26e9e"} Mar 20 16:26:02 crc kubenswrapper[4675]: I0320 16:26:02.441282 4675 generic.go:334] "Generic (PLEG): container finished" podID="68eaed96-a9c7-4876-9610-73f080a5372b" containerID="2b04de1c1e94561ecd1290ffd74176754b284dccc1353d9a15e9fa353bdf39e6" exitCode=0 Mar 20 16:26:02 crc kubenswrapper[4675]: I0320 16:26:02.441321 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" event={"ID":"68eaed96-a9c7-4876-9610-73f080a5372b","Type":"ContainerDied","Data":"2b04de1c1e94561ecd1290ffd74176754b284dccc1353d9a15e9fa353bdf39e6"} Mar 20 16:26:03 crc kubenswrapper[4675]: I0320 16:26:03.771344 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:03 crc kubenswrapper[4675]: I0320 16:26:03.918261 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qvk\" (UniqueName: \"kubernetes.io/projected/68eaed96-a9c7-4876-9610-73f080a5372b-kube-api-access-l7qvk\") pod \"68eaed96-a9c7-4876-9610-73f080a5372b\" (UID: \"68eaed96-a9c7-4876-9610-73f080a5372b\") " Mar 20 16:26:03 crc kubenswrapper[4675]: I0320 16:26:03.930507 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68eaed96-a9c7-4876-9610-73f080a5372b-kube-api-access-l7qvk" (OuterVolumeSpecName: "kube-api-access-l7qvk") pod "68eaed96-a9c7-4876-9610-73f080a5372b" (UID: "68eaed96-a9c7-4876-9610-73f080a5372b"). InnerVolumeSpecName "kube-api-access-l7qvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:04 crc kubenswrapper[4675]: I0320 16:26:04.020118 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qvk\" (UniqueName: \"kubernetes.io/projected/68eaed96-a9c7-4876-9610-73f080a5372b-kube-api-access-l7qvk\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:04 crc kubenswrapper[4675]: I0320 16:26:04.460576 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" event={"ID":"68eaed96-a9c7-4876-9610-73f080a5372b","Type":"ContainerDied","Data":"c741b49c658673db743d30b6a0b53563065cdfbf37ec5557578c5a8f60a26e9e"} Mar 20 16:26:04 crc kubenswrapper[4675]: I0320 16:26:04.460637 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c741b49c658673db743d30b6a0b53563065cdfbf37ec5557578c5a8f60a26e9e" Mar 20 16:26:04 crc kubenswrapper[4675]: I0320 16:26:04.460666 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-2sgx8" Mar 20 16:26:04 crc kubenswrapper[4675]: I0320 16:26:04.836233 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-sn7kc"] Mar 20 16:26:04 crc kubenswrapper[4675]: I0320 16:26:04.845214 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-sn7kc"] Mar 20 16:26:06 crc kubenswrapper[4675]: I0320 16:26:06.686333 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709c8047-9593-4375-aaae-b982f574e0c0" path="/var/lib/kubelet/pods/709c8047-9593-4375-aaae-b982f574e0c0/volumes" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.357616 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8jc4p"] Mar 20 16:26:09 crc kubenswrapper[4675]: E0320 16:26:09.358686 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68eaed96-a9c7-4876-9610-73f080a5372b" containerName="oc" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.358698 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="68eaed96-a9c7-4876-9610-73f080a5372b" containerName="oc" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.358918 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="68eaed96-a9c7-4876-9610-73f080a5372b" containerName="oc" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.360159 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.370035 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8jc4p"] Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.454173 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6km86\" (UniqueName: \"kubernetes.io/projected/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-kube-api-access-6km86\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.454252 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-utilities\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.454310 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-catalog-content\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.556053 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6km86\" (UniqueName: \"kubernetes.io/projected/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-kube-api-access-6km86\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.556166 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-utilities\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.556202 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-catalog-content\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.556724 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-utilities\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.556880 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-catalog-content\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.577502 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6km86\" (UniqueName: \"kubernetes.io/projected/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-kube-api-access-6km86\") pod \"certified-operators-8jc4p\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:09 crc kubenswrapper[4675]: I0320 16:26:09.689590 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:10 crc kubenswrapper[4675]: I0320 16:26:10.234681 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8jc4p"] Mar 20 16:26:10 crc kubenswrapper[4675]: I0320 16:26:10.519950 4675 generic.go:334] "Generic (PLEG): container finished" podID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerID="23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1" exitCode=0 Mar 20 16:26:10 crc kubenswrapper[4675]: I0320 16:26:10.520001 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerDied","Data":"23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1"} Mar 20 16:26:10 crc kubenswrapper[4675]: I0320 16:26:10.520216 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerStarted","Data":"d8619b4658d8f2076fde53134317e22b39b4f7fc912a53c6ea4669c59e5d00dd"} Mar 20 16:26:11 crc kubenswrapper[4675]: I0320 16:26:11.530637 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerStarted","Data":"523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e"} Mar 20 16:26:12 crc kubenswrapper[4675]: I0320 16:26:12.542687 4675 generic.go:334] "Generic (PLEG): container finished" podID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerID="523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e" exitCode=0 Mar 20 16:26:12 crc kubenswrapper[4675]: I0320 16:26:12.542733 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" 
event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerDied","Data":"523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e"} Mar 20 16:26:13 crc kubenswrapper[4675]: I0320 16:26:13.554507 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerStarted","Data":"b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9"} Mar 20 16:26:13 crc kubenswrapper[4675]: I0320 16:26:13.575003 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8jc4p" podStartSLOduration=2.180429851 podStartE2EDuration="4.57498762s" podCreationTimestamp="2026-03-20 16:26:09 +0000 UTC" firstStartedPulling="2026-03-20 16:26:10.52158913 +0000 UTC m=+1490.555218667" lastFinishedPulling="2026-03-20 16:26:12.916146889 +0000 UTC m=+1492.949776436" observedRunningTime="2026-03-20 16:26:13.573488637 +0000 UTC m=+1493.607118174" watchObservedRunningTime="2026-03-20 16:26:13.57498762 +0000 UTC m=+1493.608617147" Mar 20 16:26:19 crc kubenswrapper[4675]: I0320 16:26:19.690247 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:19 crc kubenswrapper[4675]: I0320 16:26:19.691061 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:19 crc kubenswrapper[4675]: I0320 16:26:19.736537 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:20 crc kubenswrapper[4675]: I0320 16:26:20.685326 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:20 crc kubenswrapper[4675]: I0320 16:26:20.734238 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-8jc4p"] Mar 20 16:26:22 crc kubenswrapper[4675]: I0320 16:26:22.639497 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8jc4p" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="registry-server" containerID="cri-o://b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9" gracePeriod=2 Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.078268 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.228319 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-catalog-content\") pod \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.228407 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6km86\" (UniqueName: \"kubernetes.io/projected/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-kube-api-access-6km86\") pod \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.228522 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-utilities\") pod \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\" (UID: \"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0\") " Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.229523 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-utilities" (OuterVolumeSpecName: "utilities") pod "a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" (UID: 
"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.237096 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-kube-api-access-6km86" (OuterVolumeSpecName: "kube-api-access-6km86") pod "a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" (UID: "a619c2ea-d3ba-4d67-9018-e1f735b5e7c0"). InnerVolumeSpecName "kube-api-access-6km86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.279626 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" (UID: "a619c2ea-d3ba-4d67-9018-e1f735b5e7c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.330207 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.330463 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6km86\" (UniqueName: \"kubernetes.io/projected/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-kube-api-access-6km86\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.330545 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.651374 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerID="b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9" exitCode=0 Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.651419 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerDied","Data":"b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9"} Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.651447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8jc4p" event={"ID":"a619c2ea-d3ba-4d67-9018-e1f735b5e7c0","Type":"ContainerDied","Data":"d8619b4658d8f2076fde53134317e22b39b4f7fc912a53c6ea4669c59e5d00dd"} Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.651469 4675 scope.go:117] "RemoveContainer" containerID="b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.651496 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8jc4p" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.680509 4675 scope.go:117] "RemoveContainer" containerID="523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.685911 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8jc4p"] Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.696380 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8jc4p"] Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.710265 4675 scope.go:117] "RemoveContainer" containerID="23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.742106 4675 scope.go:117] "RemoveContainer" containerID="b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9" Mar 20 16:26:23 crc kubenswrapper[4675]: E0320 16:26:23.742631 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9\": container with ID starting with b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9 not found: ID does not exist" containerID="b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.742681 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9"} err="failed to get container status \"b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9\": rpc error: code = NotFound desc = could not find container \"b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9\": container with ID starting with b68593600e4523ef0dc20bbe842feb5a354f82065c9b6465385fb0afc16cbff9 not 
found: ID does not exist" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.742715 4675 scope.go:117] "RemoveContainer" containerID="523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e" Mar 20 16:26:23 crc kubenswrapper[4675]: E0320 16:26:23.743626 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e\": container with ID starting with 523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e not found: ID does not exist" containerID="523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.743666 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e"} err="failed to get container status \"523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e\": rpc error: code = NotFound desc = could not find container \"523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e\": container with ID starting with 523c89b668ad8b6e50bc577c6709e3a0961783357a6569a24afc6304db7de33e not found: ID does not exist" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.743707 4675 scope.go:117] "RemoveContainer" containerID="23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1" Mar 20 16:26:23 crc kubenswrapper[4675]: E0320 16:26:23.744037 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1\": container with ID starting with 23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1 not found: ID does not exist" containerID="23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1" Mar 20 16:26:23 crc kubenswrapper[4675]: I0320 16:26:23.744072 4675 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1"} err="failed to get container status \"23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1\": rpc error: code = NotFound desc = could not find container \"23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1\": container with ID starting with 23c84fb888a95c0131217dfbaa94388fb438dc294e1a0a75b8d395cc7a6ab1e1 not found: ID does not exist" Mar 20 16:26:24 crc kubenswrapper[4675]: I0320 16:26:24.683679 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" path="/var/lib/kubelet/pods/a619c2ea-d3ba-4d67-9018-e1f735b5e7c0/volumes" Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.596596 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t5gcc/must-gather-sl9zj"] Mar 20 16:26:28 crc kubenswrapper[4675]: E0320 16:26:28.597457 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="registry-server" Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.597470 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="registry-server" Mar 20 16:26:28 crc kubenswrapper[4675]: E0320 16:26:28.597480 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="extract-utilities" Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.597485 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="extract-utilities" Mar 20 16:26:28 crc kubenswrapper[4675]: E0320 16:26:28.597500 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="extract-content" Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 
16:26:28.597508 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="extract-content"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.597713 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a619c2ea-d3ba-4d67-9018-e1f735b5e7c0" containerName="registry-server"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.598666 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.604569 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t5gcc"/"kube-root-ca.crt"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.607146 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t5gcc"/"openshift-service-ca.crt"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.683882 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t5gcc/must-gather-sl9zj"]
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.728389 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a031bc01-da3d-4159-969f-a7509db918cd-must-gather-output\") pod \"must-gather-sl9zj\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.728679 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-955s6\" (UniqueName: \"kubernetes.io/projected/a031bc01-da3d-4159-969f-a7509db918cd-kube-api-access-955s6\") pod \"must-gather-sl9zj\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.837097 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a031bc01-da3d-4159-969f-a7509db918cd-must-gather-output\") pod \"must-gather-sl9zj\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.837302 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-955s6\" (UniqueName: \"kubernetes.io/projected/a031bc01-da3d-4159-969f-a7509db918cd-kube-api-access-955s6\") pod \"must-gather-sl9zj\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.837560 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a031bc01-da3d-4159-969f-a7509db918cd-must-gather-output\") pod \"must-gather-sl9zj\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.861640 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-955s6\" (UniqueName: \"kubernetes.io/projected/a031bc01-da3d-4159-969f-a7509db918cd-kube-api-access-955s6\") pod \"must-gather-sl9zj\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:28 crc kubenswrapper[4675]: I0320 16:26:28.920368 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/must-gather-sl9zj"
Mar 20 16:26:29 crc kubenswrapper[4675]: I0320 16:26:29.368482 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t5gcc/must-gather-sl9zj"]
Mar 20 16:26:29 crc kubenswrapper[4675]: I0320 16:26:29.735215 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" event={"ID":"a031bc01-da3d-4159-969f-a7509db918cd","Type":"ContainerStarted","Data":"15ab304a55bdb8c83e6a878a3bd05ad134e0ed3888ed4a23c3210caa1fe937e9"}
Mar 20 16:26:34 crc kubenswrapper[4675]: I0320 16:26:34.424448 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:26:34 crc kubenswrapper[4675]: I0320 16:26:34.424949 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.596690 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8dmp9"]
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.601273 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.621493 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dmp9"]
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.718650 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-utilities\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.718746 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxdr\" (UniqueName: \"kubernetes.io/projected/dd4a7e92-2c54-448a-a443-7721fe877dc9-kube-api-access-tpxdr\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.718849 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-catalog-content\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.820851 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-utilities\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.820911 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxdr\" (UniqueName: \"kubernetes.io/projected/dd4a7e92-2c54-448a-a443-7721fe877dc9-kube-api-access-tpxdr\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.820960 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-catalog-content\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.821596 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-utilities\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.821633 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-catalog-content\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.841842 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxdr\" (UniqueName: \"kubernetes.io/projected/dd4a7e92-2c54-448a-a443-7721fe877dc9-kube-api-access-tpxdr\") pod \"community-operators-8dmp9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") " pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:38 crc kubenswrapper[4675]: I0320 16:26:38.926821 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.481003 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dmp9"]
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.828441 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" event={"ID":"a031bc01-da3d-4159-969f-a7509db918cd","Type":"ContainerStarted","Data":"ea88def3f8b5e08517350161e03768e97f3dd7bbc04f107671326230a065b749"}
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.828797 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" event={"ID":"a031bc01-da3d-4159-969f-a7509db918cd","Type":"ContainerStarted","Data":"6e91531eff589939311fc0dbd4877952a478b0f4b1e30c85650b4e35d1d2a294"}
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.831661 4675 generic.go:334] "Generic (PLEG): container finished" podID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerID="13969c2b55f7ab9f2164675fb925cb8978945c36384905cc91bbf484e68c7029" exitCode=0
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.831703 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerDied","Data":"13969c2b55f7ab9f2164675fb925cb8978945c36384905cc91bbf484e68c7029"}
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.831728 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerStarted","Data":"e89512b93294061c46ed071546c2112a392aeac150a1e29d30e4c5c878e3b222"}
Mar 20 16:26:39 crc kubenswrapper[4675]: I0320 16:26:39.846174 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" podStartSLOduration=2.073500562 podStartE2EDuration="11.846153776s" podCreationTimestamp="2026-03-20 16:26:28 +0000 UTC" firstStartedPulling="2026-03-20 16:26:29.370611202 +0000 UTC m=+1509.404240739" lastFinishedPulling="2026-03-20 16:26:39.143264416 +0000 UTC m=+1519.176893953" observedRunningTime="2026-03-20 16:26:39.845454986 +0000 UTC m=+1519.879084523" watchObservedRunningTime="2026-03-20 16:26:39.846153776 +0000 UTC m=+1519.879783313"
Mar 20 16:26:40 crc kubenswrapper[4675]: I0320 16:26:40.843783 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerStarted","Data":"f3214e1d233bc75fd8aa06b97e1edede8a0e669ebfa56e52be2e3195cdca4713"}
Mar 20 16:26:41 crc kubenswrapper[4675]: I0320 16:26:41.854026 4675 generic.go:334] "Generic (PLEG): container finished" podID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerID="f3214e1d233bc75fd8aa06b97e1edede8a0e669ebfa56e52be2e3195cdca4713" exitCode=0
Mar 20 16:26:41 crc kubenswrapper[4675]: I0320 16:26:41.854065 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerDied","Data":"f3214e1d233bc75fd8aa06b97e1edede8a0e669ebfa56e52be2e3195cdca4713"}
Mar 20 16:26:42 crc kubenswrapper[4675]: I0320 16:26:42.869473 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerStarted","Data":"423d551ac1545a4dcfb15c7dfdd1336aa65240c796f3c170333a89ae44f01fd6"}
Mar 20 16:26:42 crc kubenswrapper[4675]: I0320 16:26:42.894142 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8dmp9" podStartSLOduration=2.455192207 podStartE2EDuration="4.894112191s" podCreationTimestamp="2026-03-20 16:26:38 +0000 UTC" firstStartedPulling="2026-03-20 16:26:39.833091647 +0000 UTC m=+1519.866721184" lastFinishedPulling="2026-03-20 16:26:42.272011631 +0000 UTC m=+1522.305641168" observedRunningTime="2026-03-20 16:26:42.886756493 +0000 UTC m=+1522.920386040" watchObservedRunningTime="2026-03-20 16:26:42.894112191 +0000 UTC m=+1522.927741728"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.019101 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t5gcc/crc-debug-l9h94"]
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.021122 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.023456 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t5gcc"/"default-dockercfg-rxbzd"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.164456 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jxbw\" (UniqueName: \"kubernetes.io/projected/ddd5d852-bfd3-4537-8147-2501fafc5e8e-kube-api-access-8jxbw\") pod \"crc-debug-l9h94\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") " pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.164520 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddd5d852-bfd3-4537-8147-2501fafc5e8e-host\") pod \"crc-debug-l9h94\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") " pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.265896 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jxbw\" (UniqueName: \"kubernetes.io/projected/ddd5d852-bfd3-4537-8147-2501fafc5e8e-kube-api-access-8jxbw\") pod \"crc-debug-l9h94\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") " pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.265961 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddd5d852-bfd3-4537-8147-2501fafc5e8e-host\") pod \"crc-debug-l9h94\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") " pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.266223 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddd5d852-bfd3-4537-8147-2501fafc5e8e-host\") pod \"crc-debug-l9h94\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") " pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.303086 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jxbw\" (UniqueName: \"kubernetes.io/projected/ddd5d852-bfd3-4537-8147-2501fafc5e8e-kube-api-access-8jxbw\") pod \"crc-debug-l9h94\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") " pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.344802 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:26:45 crc kubenswrapper[4675]: I0320 16:26:45.916052 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/crc-debug-l9h94" event={"ID":"ddd5d852-bfd3-4537-8147-2501fafc5e8e","Type":"ContainerStarted","Data":"a0a5d9f38d76b3c1b665a06bf16d8ccea45e8b721c03f8033581b577c4b4f94f"}
Mar 20 16:26:48 crc kubenswrapper[4675]: I0320 16:26:48.927610 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:48 crc kubenswrapper[4675]: I0320 16:26:48.927993 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:48 crc kubenswrapper[4675]: I0320 16:26:48.975433 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:49 crc kubenswrapper[4675]: I0320 16:26:49.040688 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:49 crc kubenswrapper[4675]: I0320 16:26:49.212465 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dmp9"]
Mar 20 16:26:50 crc kubenswrapper[4675]: I0320 16:26:50.978357 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8dmp9" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="registry-server" containerID="cri-o://423d551ac1545a4dcfb15c7dfdd1336aa65240c796f3c170333a89ae44f01fd6" gracePeriod=2
Mar 20 16:26:51 crc kubenswrapper[4675]: I0320 16:26:51.988533 4675 generic.go:334] "Generic (PLEG): container finished" podID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerID="423d551ac1545a4dcfb15c7dfdd1336aa65240c796f3c170333a89ae44f01fd6" exitCode=0
Mar 20 16:26:51 crc kubenswrapper[4675]: I0320 16:26:51.988625 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerDied","Data":"423d551ac1545a4dcfb15c7dfdd1336aa65240c796f3c170333a89ae44f01fd6"}
Mar 20 16:26:55 crc kubenswrapper[4675]: I0320 16:26:55.074687 4675 scope.go:117] "RemoveContainer" containerID="b787edcb44bf9ee2b1470910ae1cafe7fbe75892d4dcdd8e91be284e1a07d49f"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.039994 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.045779 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dmp9" event={"ID":"dd4a7e92-2c54-448a-a443-7721fe877dc9","Type":"ContainerDied","Data":"e89512b93294061c46ed071546c2112a392aeac150a1e29d30e4c5c878e3b222"}
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.045924 4675 scope.go:117] "RemoveContainer" containerID="423d551ac1545a4dcfb15c7dfdd1336aa65240c796f3c170333a89ae44f01fd6"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.046150 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dmp9"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.057036 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/crc-debug-l9h94" event={"ID":"ddd5d852-bfd3-4537-8147-2501fafc5e8e","Type":"ContainerStarted","Data":"749b242d0f1ff3fc57d506d4686590503b48ea75aa912e7fd61b29c71b0fed5b"}
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.084738 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t5gcc/crc-debug-l9h94" podStartSLOduration=0.68164406 podStartE2EDuration="12.08472126s" podCreationTimestamp="2026-03-20 16:26:45 +0000 UTC" firstStartedPulling="2026-03-20 16:26:45.380603557 +0000 UTC m=+1525.414233094" lastFinishedPulling="2026-03-20 16:26:56.783680757 +0000 UTC m=+1536.817310294" observedRunningTime="2026-03-20 16:26:57.081333764 +0000 UTC m=+1537.114963301" watchObservedRunningTime="2026-03-20 16:26:57.08472126 +0000 UTC m=+1537.118350797"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.107841 4675 scope.go:117] "RemoveContainer" containerID="f3214e1d233bc75fd8aa06b97e1edede8a0e669ebfa56e52be2e3195cdca4713"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.130638 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-catalog-content\") pod \"dd4a7e92-2c54-448a-a443-7721fe877dc9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") "
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.130683 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-utilities\") pod \"dd4a7e92-2c54-448a-a443-7721fe877dc9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") "
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.130783 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxdr\" (UniqueName: \"kubernetes.io/projected/dd4a7e92-2c54-448a-a443-7721fe877dc9-kube-api-access-tpxdr\") pod \"dd4a7e92-2c54-448a-a443-7721fe877dc9\" (UID: \"dd4a7e92-2c54-448a-a443-7721fe877dc9\") "
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.131935 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-utilities" (OuterVolumeSpecName: "utilities") pod "dd4a7e92-2c54-448a-a443-7721fe877dc9" (UID: "dd4a7e92-2c54-448a-a443-7721fe877dc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.136664 4675 scope.go:117] "RemoveContainer" containerID="13969c2b55f7ab9f2164675fb925cb8978945c36384905cc91bbf484e68c7029"
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.136908 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4a7e92-2c54-448a-a443-7721fe877dc9-kube-api-access-tpxdr" (OuterVolumeSpecName: "kube-api-access-tpxdr") pod "dd4a7e92-2c54-448a-a443-7721fe877dc9" (UID: "dd4a7e92-2c54-448a-a443-7721fe877dc9"). InnerVolumeSpecName "kube-api-access-tpxdr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.187620 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd4a7e92-2c54-448a-a443-7721fe877dc9" (UID: "dd4a7e92-2c54-448a-a443-7721fe877dc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.232456 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxdr\" (UniqueName: \"kubernetes.io/projected/dd4a7e92-2c54-448a-a443-7721fe877dc9-kube-api-access-tpxdr\") on node \"crc\" DevicePath \"\""
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.232657 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.232753 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4a7e92-2c54-448a-a443-7721fe877dc9-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.404316 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dmp9"]
Mar 20 16:26:57 crc kubenswrapper[4675]: I0320 16:26:57.413093 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8dmp9"]
Mar 20 16:26:58 crc kubenswrapper[4675]: I0320 16:26:58.684105 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" path="/var/lib/kubelet/pods/dd4a7e92-2c54-448a-a443-7721fe877dc9/volumes"
Mar 20 16:27:04 crc kubenswrapper[4675]: I0320 16:27:04.424997 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:27:04 crc kubenswrapper[4675]: I0320 16:27:04.425507 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:27:14 crc kubenswrapper[4675]: I0320 16:27:14.218698 4675 generic.go:334] "Generic (PLEG): container finished" podID="ddd5d852-bfd3-4537-8147-2501fafc5e8e" containerID="749b242d0f1ff3fc57d506d4686590503b48ea75aa912e7fd61b29c71b0fed5b" exitCode=0
Mar 20 16:27:14 crc kubenswrapper[4675]: I0320 16:27:14.218856 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/crc-debug-l9h94" event={"ID":"ddd5d852-bfd3-4537-8147-2501fafc5e8e","Type":"ContainerDied","Data":"749b242d0f1ff3fc57d506d4686590503b48ea75aa912e7fd61b29c71b0fed5b"}
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.330793 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.359935 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t5gcc/crc-debug-l9h94"]
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.367265 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t5gcc/crc-debug-l9h94"]
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.466607 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddd5d852-bfd3-4537-8147-2501fafc5e8e-host\") pod \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") "
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.466745 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jxbw\" (UniqueName: \"kubernetes.io/projected/ddd5d852-bfd3-4537-8147-2501fafc5e8e-kube-api-access-8jxbw\") pod \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\" (UID: \"ddd5d852-bfd3-4537-8147-2501fafc5e8e\") "
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.466755 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddd5d852-bfd3-4537-8147-2501fafc5e8e-host" (OuterVolumeSpecName: "host") pod "ddd5d852-bfd3-4537-8147-2501fafc5e8e" (UID: "ddd5d852-bfd3-4537-8147-2501fafc5e8e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.467232 4675 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddd5d852-bfd3-4537-8147-2501fafc5e8e-host\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.472880 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd5d852-bfd3-4537-8147-2501fafc5e8e-kube-api-access-8jxbw" (OuterVolumeSpecName: "kube-api-access-8jxbw") pod "ddd5d852-bfd3-4537-8147-2501fafc5e8e" (UID: "ddd5d852-bfd3-4537-8147-2501fafc5e8e"). InnerVolumeSpecName "kube-api-access-8jxbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:27:15 crc kubenswrapper[4675]: I0320 16:27:15.569662 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jxbw\" (UniqueName: \"kubernetes.io/projected/ddd5d852-bfd3-4537-8147-2501fafc5e8e-kube-api-access-8jxbw\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.238388 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0a5d9f38d76b3c1b665a06bf16d8ccea45e8b721c03f8033581b577c4b4f94f"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.238458 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-l9h94"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539097 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t5gcc/crc-debug-4cbft"]
Mar 20 16:27:16 crc kubenswrapper[4675]: E0320 16:27:16.539396 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="extract-content"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539408 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="extract-content"
Mar 20 16:27:16 crc kubenswrapper[4675]: E0320 16:27:16.539419 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="extract-utilities"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539425 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="extract-utilities"
Mar 20 16:27:16 crc kubenswrapper[4675]: E0320 16:27:16.539443 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="registry-server"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539449 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="registry-server"
Mar 20 16:27:16 crc kubenswrapper[4675]: E0320 16:27:16.539466 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd5d852-bfd3-4537-8147-2501fafc5e8e" containerName="container-00"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539472 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd5d852-bfd3-4537-8147-2501fafc5e8e" containerName="container-00"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539634 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd5d852-bfd3-4537-8147-2501fafc5e8e" containerName="container-00"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.539656 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4a7e92-2c54-448a-a443-7721fe877dc9" containerName="registry-server"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.540183 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.541794 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t5gcc"/"default-dockercfg-rxbzd"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.685689 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd5d852-bfd3-4537-8147-2501fafc5e8e" path="/var/lib/kubelet/pods/ddd5d852-bfd3-4537-8147-2501fafc5e8e/volumes"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.689998 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daca9405-d375-409d-90a2-9899c6fde998-host\") pod \"crc-debug-4cbft\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") " pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.690064 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnkx\" (UniqueName: \"kubernetes.io/projected/daca9405-d375-409d-90a2-9899c6fde998-kube-api-access-ncnkx\") pod \"crc-debug-4cbft\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") " pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.791402 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daca9405-d375-409d-90a2-9899c6fde998-host\") pod \"crc-debug-4cbft\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") " pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.791509 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncnkx\" (UniqueName: \"kubernetes.io/projected/daca9405-d375-409d-90a2-9899c6fde998-kube-api-access-ncnkx\") pod \"crc-debug-4cbft\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") " pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.792071 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daca9405-d375-409d-90a2-9899c6fde998-host\") pod \"crc-debug-4cbft\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") " pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.818109 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnkx\" (UniqueName: \"kubernetes.io/projected/daca9405-d375-409d-90a2-9899c6fde998-kube-api-access-ncnkx\") pod \"crc-debug-4cbft\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") " pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: I0320 16:27:16.856521 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:16 crc kubenswrapper[4675]: W0320 16:27:16.913874 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaca9405_d375_409d_90a2_9899c6fde998.slice/crio-39e1c34c2ee2bce4f1894eae79bf38a67c53747f27beb908c65afacd0b691c07 WatchSource:0}: Error finding container 39e1c34c2ee2bce4f1894eae79bf38a67c53747f27beb908c65afacd0b691c07: Status 404 returned error can't find the container with id 39e1c34c2ee2bce4f1894eae79bf38a67c53747f27beb908c65afacd0b691c07
Mar 20 16:27:17 crc kubenswrapper[4675]: I0320 16:27:17.249164 4675 generic.go:334] "Generic (PLEG): container finished" podID="daca9405-d375-409d-90a2-9899c6fde998" containerID="d49141be2a6465571870772e189d1a327fc5f01258a33e4b431edef2314d3d19" exitCode=1
Mar 20 16:27:17 crc kubenswrapper[4675]: I0320 16:27:17.249345 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/crc-debug-4cbft" event={"ID":"daca9405-d375-409d-90a2-9899c6fde998","Type":"ContainerDied","Data":"d49141be2a6465571870772e189d1a327fc5f01258a33e4b431edef2314d3d19"}
Mar 20 16:27:17 crc kubenswrapper[4675]: I0320 16:27:17.249455 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/crc-debug-4cbft" event={"ID":"daca9405-d375-409d-90a2-9899c6fde998","Type":"ContainerStarted","Data":"39e1c34c2ee2bce4f1894eae79bf38a67c53747f27beb908c65afacd0b691c07"}
Mar 20 16:27:17 crc kubenswrapper[4675]: I0320 16:27:17.292227 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t5gcc/crc-debug-4cbft"]
Mar 20 16:27:17 crc kubenswrapper[4675]: I0320 16:27:17.303378 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t5gcc/crc-debug-4cbft"]
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.363939 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.425925 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daca9405-d375-409d-90a2-9899c6fde998-host\") pod \"daca9405-d375-409d-90a2-9899c6fde998\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") "
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.425996 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncnkx\" (UniqueName: \"kubernetes.io/projected/daca9405-d375-409d-90a2-9899c6fde998-kube-api-access-ncnkx\") pod \"daca9405-d375-409d-90a2-9899c6fde998\" (UID: \"daca9405-d375-409d-90a2-9899c6fde998\") "
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.426082 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daca9405-d375-409d-90a2-9899c6fde998-host" (OuterVolumeSpecName: "host") pod "daca9405-d375-409d-90a2-9899c6fde998" (UID: "daca9405-d375-409d-90a2-9899c6fde998"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.426533 4675 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daca9405-d375-409d-90a2-9899c6fde998-host\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.436018 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daca9405-d375-409d-90a2-9899c6fde998-kube-api-access-ncnkx" (OuterVolumeSpecName: "kube-api-access-ncnkx") pod "daca9405-d375-409d-90a2-9899c6fde998" (UID: "daca9405-d375-409d-90a2-9899c6fde998"). InnerVolumeSpecName "kube-api-access-ncnkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.528781 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncnkx\" (UniqueName: \"kubernetes.io/projected/daca9405-d375-409d-90a2-9899c6fde998-kube-api-access-ncnkx\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:18 crc kubenswrapper[4675]: I0320 16:27:18.685317 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daca9405-d375-409d-90a2-9899c6fde998" path="/var/lib/kubelet/pods/daca9405-d375-409d-90a2-9899c6fde998/volumes"
Mar 20 16:27:19 crc kubenswrapper[4675]: I0320 16:27:19.267564 4675 scope.go:117] "RemoveContainer" containerID="d49141be2a6465571870772e189d1a327fc5f01258a33e4b431edef2314d3d19"
Mar 20 16:27:19 crc kubenswrapper[4675]: I0320 16:27:19.267603 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/crc-debug-4cbft"
Mar 20 16:27:34 crc kubenswrapper[4675]: I0320 16:27:34.424940 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:27:34 crc kubenswrapper[4675]: I0320 16:27:34.425755 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:27:34 crc kubenswrapper[4675]: I0320 16:27:34.426652 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5"
Mar 20 16:27:34 crc kubenswrapper[4675]: I0320 16:27:34.429105 4675
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:27:34 crc kubenswrapper[4675]: I0320 16:27:34.429179 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" gracePeriod=600 Mar 20 16:27:34 crc kubenswrapper[4675]: E0320 16:27:34.549828 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:27:35 crc kubenswrapper[4675]: I0320 16:27:35.406473 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" exitCode=0 Mar 20 16:27:35 crc kubenswrapper[4675]: I0320 16:27:35.406564 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"} Mar 20 16:27:35 crc kubenswrapper[4675]: I0320 16:27:35.406847 4675 scope.go:117] "RemoveContainer" 
containerID="7732914d5ec5c37cec22ffa5532f80bae40c4bcbf0ea409824aff0266bbf1edb" Mar 20 16:27:35 crc kubenswrapper[4675]: I0320 16:27:35.407478 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:27:35 crc kubenswrapper[4675]: E0320 16:27:35.407793 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.264819 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d6cd9476b-xwbkm_32e344b5-b713-4104-ac00-0793fd3e94d9/barbican-api/0.log" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.428210 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d6cd9476b-xwbkm_32e344b5-b713-4104-ac00-0793fd3e94d9/barbican-api-log/0.log" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.520285 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-b060-account-create-update-5gxch_45c297c7-2162-4bd6-bd83-db8bbc61d008/mariadb-account-create-update/0.log" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.651677 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-kwslq_9503b7d8-c02c-4d31-9711-271cd2be4778/mariadb-database-create/0.log" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.674259 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:27:47 crc kubenswrapper[4675]: E0320 16:27:47.674494 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.720251 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-hhfc9_c1560aa0-d06c-4c98-80bf-0635065cac6f/barbican-db-sync/0.log" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.906539 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64fc78bb94-nfn76_ca93e3f6-95e2-4973-8100-94e89ad3515b/barbican-keystone-listener/0.log" Mar 20 16:27:47 crc kubenswrapper[4675]: I0320 16:27:47.966255 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64fc78bb94-nfn76_ca93e3f6-95e2-4973-8100-94e89ad3515b/barbican-keystone-listener-log/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.074587 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cc6bc59f9-6rgrv_780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9/barbican-worker/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.134968 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cc6bc59f9-6rgrv_780e2ce3-c4a3-4b3f-a9b7-b0191679e0c9/barbican-worker-log/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.251288 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_179982ec-516c-43ea-b479-9e9309759410/ceilometer-central-agent/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.303158 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_179982ec-516c-43ea-b479-9e9309759410/ceilometer-notification-agent/0.log" Mar 20 
16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.363786 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_179982ec-516c-43ea-b479-9e9309759410/proxy-httpd/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.425325 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_179982ec-516c-43ea-b479-9e9309759410/sg-core/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.512546 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cab423d6-a026-41a9-8ae7-7ab2339de5ef/cinder-api/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.702371 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cab423d6-a026-41a9-8ae7-7ab2339de5ef/cinder-api-log/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.720481 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-d571-account-create-update-ghkph_a0b7b710-7048-42c1-8215-c242f34da40f/mariadb-account-create-update/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.937540 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-jp9g7_2e6fe44b-0699-4235-af89-d546820b782a/mariadb-database-create/0.log" Mar 20 16:27:48 crc kubenswrapper[4675]: I0320 16:27:48.989273 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-sync-tt28r_37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43/cinder-db-sync/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.157900 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_feac59ee-65ab-4f54-a829-fdea75fd800b/probe/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.211968 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_feac59ee-65ab-4f54-a829-fdea75fd800b/cinder-scheduler/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.251013 
4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59cf4bdb65-m255x_e14e9637-edff-450c-ad95-6c0367ee120d/init/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.457336 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59cf4bdb65-m255x_e14e9637-edff-450c-ad95-6c0367ee120d/dnsmasq-dns/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.498877 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59cf4bdb65-m255x_e14e9637-edff-450c-ad95-6c0367ee120d/init/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.515377 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-1c94-account-create-update-gs974_561932af-1ef9-47ff-9da6-b661477b60ae/mariadb-account-create-update/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.648134 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-jrt74_5c7deb55-a7bb-4207-822f-c348a40ee473/mariadb-database-create/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.707243 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-ddcp2_97e8338f-ae50-4341-aba0-91bf9890a9bc/glance-db-sync/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.895453 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f9de0228-878e-4311-8146-93fdff40b851/glance-httpd/0.log" Mar 20 16:27:49 crc kubenswrapper[4675]: I0320 16:27:49.909405 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f9de0228-878e-4311-8146-93fdff40b851/glance-log/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.119814 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4be7b0e7-cc04-4551-9775-b231792b3e25/glance-httpd/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: 
I0320 16:27:50.144880 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4be7b0e7-cc04-4551-9775-b231792b3e25/glance-log/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.269862 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d79d4db6d-vnw9g_b81ab73e-24ce-451b-9064-d6ebea2c5976/horizon/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.413199 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d79d4db6d-vnw9g_b81ab73e-24ce-451b-9064-d6ebea2c5976/horizon-log/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.419813 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-3b23-account-create-update-zhhwz_b1d96077-f705-44cc-a64d-dd4d7df551a6/mariadb-account-create-update/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.708280 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-zlp5c_de350b0e-5712-4f65-b01b-27814457bee4/keystone-bootstrap/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.775252 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5f7ccf99c9-m6s8x_16cf399d-ef4a-4572-a3c3-73e30bb2a54c/keystone-api/0.log" Mar 20 16:27:50 crc kubenswrapper[4675]: I0320 16:27:50.938346 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-qwg4d_4aee7127-2ed6-4ece-9eaa-0dfda0be02ad/mariadb-database-create/0.log" Mar 20 16:27:51 crc kubenswrapper[4675]: I0320 16:27:51.015802 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-wsz8z_3f81f694-f0b5-4f31-a090-748418d6fd08/keystone-db-sync/0.log" Mar 20 16:27:51 crc kubenswrapper[4675]: I0320 16:27:51.166073 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8f2c52ef-b68a-4077-80f4-455f7feb3f0e/kube-state-metrics/0.log" Mar 20 16:27:51 
crc kubenswrapper[4675]: I0320 16:27:51.389058 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f9ffbdb49-tcvn8_c411bc97-1398-438f-a194-1d53f896f405/neutron-api/0.log" Mar 20 16:27:51 crc kubenswrapper[4675]: I0320 16:27:51.440292 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f9ffbdb49-tcvn8_c411bc97-1398-438f-a194-1d53f896f405/neutron-httpd/0.log" Mar 20 16:27:51 crc kubenswrapper[4675]: I0320 16:27:51.660159 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7038-account-create-update-9c4gc_42b97a81-a094-4cb5-ba97-33eb354c1d97/mariadb-account-create-update/0.log" Mar 20 16:27:51 crc kubenswrapper[4675]: I0320 16:27:51.933444 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-zj9wf_931f540a-ffd9-4d4a-b001-f68408fa02fb/mariadb-database-create/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.114381 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-ddfbw_7935d7aa-cb6b-4b66-a58f-31e0cce41114/neutron-db-sync/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.216436 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43/nova-api-api/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.315710 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-9kb49_93745d9e-fb27-46b1-9305-de6265b0cc8d/mariadb-database-create/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.339701 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_59a9fb1c-a12a-48bc-98a8-fcf0b1d1bf43/nova-api-log/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.478893 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-fe47-account-create-update-gnq8t_dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411/mariadb-account-create-update/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.598297 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-gjbkq_3f80fe29-bb20-44a9-a687-24bc81243833/nova-manage/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.795172 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_965c51c2-a3e5-46f3-8a76-3cdb812c96c6/nova-cell0-conductor-conductor/0.log" Mar 20 16:27:52 crc kubenswrapper[4675]: I0320 16:27:52.818167 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-cvcgw_e5eba599-99e1-4899-8ae7-0ba38e60724b/nova-cell0-conductor-db-sync/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.070280 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-pkw66_6d340bca-db9e-4748-9cad-c3856ffe6edf/mariadb-database-create/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.077761 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-f9fc-account-create-update-qwrrp_474cfa15-2932-4b81-a00e-fc9c6648e91b/mariadb-account-create-update/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.260660 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-6ae5-account-create-update-7mtsv_a18287f0-a719-4ea8-badd-3f2f13bd4209/mariadb-account-create-update/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.313928 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-t9dln_8094317b-8891-4a65-ac16-3211f0209ba4/nova-manage/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.660819 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-zbgx5_b9ddb96c-6950-4300-bf0d-cf65d46c12fb/nova-cell1-conductor-db-sync/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.748821 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_fa48124b-637f-43d1-9136-24218d69e177/nova-cell1-conductor-conductor/0.log" Mar 20 16:27:53 crc kubenswrapper[4675]: I0320 16:27:53.839688 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-295ch_bcc971d1-4036-4338-80ba-8f4f00c10b2a/mariadb-database-create/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.116823 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a3395380-db60-4ec1-9526-0a0796b45d73/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.207469 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b34a59d7-ed22-4f2c-8214-0c69b352bbb1/nova-metadata-log/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.252805 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b34a59d7-ed22-4f2c-8214-0c69b352bbb1/nova-metadata-metadata/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.491205 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1c6bc1a7-15cb-45dd-a6e6-cb1fd77f8d1f/nova-scheduler-scheduler/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.491754 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_47fb8c80-d4bd-42fb-bcc3-752f854574b4/mysql-bootstrap/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.751280 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_acbf924d-7363-4489-a64a-51c2949a2a69/mysql-bootstrap/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 
16:27:54.773358 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_47fb8c80-d4bd-42fb-bcc3-752f854574b4/galera/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.777945 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_47fb8c80-d4bd-42fb-bcc3-752f854574b4/mysql-bootstrap/0.log" Mar 20 16:27:54 crc kubenswrapper[4675]: I0320 16:27:54.944686 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_acbf924d-7363-4489-a64a-51c2949a2a69/mysql-bootstrap/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.028592 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_acbf924d-7363-4489-a64a-51c2949a2a69/galera/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.072047 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_226f3568-6214-4345-9991-3bda09594c67/openstackclient/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.252562 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-b5cf6_b563a826-d7ed-453e-89f6-aec33699291e/ovn-controller/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.273903 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-np69v_694ef288-9c84-4800-8f0c-aa30aa0c74a0/openstack-network-exporter/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.467514 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-prjxt_6caea9dd-db8f-4f21-b684-5899258ff290/ovsdb-server-init/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.707159 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-prjxt_6caea9dd-db8f-4f21-b684-5899258ff290/ovs-vswitchd/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.729293 4675 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-prjxt_6caea9dd-db8f-4f21-b684-5899258ff290/ovsdb-server-init/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.754455 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-prjxt_6caea9dd-db8f-4f21-b684-5899258ff290/ovsdb-server/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.914444 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_53eea136-525b-482e-99ed-7f280dce9186/ovn-northd/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.931383 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_53eea136-525b-482e-99ed-7f280dce9186/openstack-network-exporter/0.log" Mar 20 16:27:55 crc kubenswrapper[4675]: I0320 16:27:55.968374 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc/openstack-network-exporter/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.170989 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_67c80b11-0763-4407-ad6b-5f1fef8ad591/openstack-network-exporter/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.174612 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1e0a38ce-32e7-4af9-a6d3-a2ed52e644cc/ovsdbserver-nb/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.237292 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_67c80b11-0763-4407-ad6b-5f1fef8ad591/ovsdbserver-sb/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.390916 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-1bbb-account-create-update-g82mn_5ccf7599-4ec9-4023-b6a4-9517656c82f8/mariadb-account-create-update/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.500430 4675 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-cf47f4dbd-zbrxj_faf8d346-4de8-473c-8d9c-a5d1cf895e4e/placement-api/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.640453 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-272fr_b5a8634b-cae4-44ff-b49b-3c2c12ef93fe/mariadb-database-create/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.643894 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-cf47f4dbd-zbrxj_faf8d346-4de8-473c-8d9c-a5d1cf895e4e/placement-log/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.797911 4675 scope.go:117] "RemoveContainer" containerID="973584f860f0c55b564ad7fd4b53111645f156751d4ae58a9eefc3f8838d78d9" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.820859 4675 scope.go:117] "RemoveContainer" containerID="e4fafa668cb38ec178f91b7602896b9cc49b97408a7a24cee7d092a92576bfb6" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.822224 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f2786789-8885-42c4-9127-c0466e2212eb/setup-container/0.log" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.864138 4675 scope.go:117] "RemoveContainer" containerID="3ad19f1a9a8a7d3ebfb1898bad183c5681bbc9300f4db3816e88fe707b0a7e03" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.907146 4675 scope.go:117] "RemoveContainer" containerID="e2062d5335da5b6ff70f634814cdf8afc41192c45da4e6d162288acbf1b264c7" Mar 20 16:27:56 crc kubenswrapper[4675]: I0320 16:27:56.923417 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-7tdnv_653f25dd-b7f2-4ec1-8569-96af48c4c388/placement-db-sync/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.093352 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f2786789-8885-42c4-9127-c0466e2212eb/setup-container/0.log" Mar 20 16:27:57 crc 
kubenswrapper[4675]: I0320 16:27:57.166055 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f2786789-8885-42c4-9127-c0466e2212eb/rabbitmq/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.191415 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_87f2f4be-70c8-409a-8fe8-c753758021f4/setup-container/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.373049 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_87f2f4be-70c8-409a-8fe8-c753758021f4/setup-container/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.460605 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-zdp9x_36a4f6bb-496f-4c50-9047-057827aefe77/mariadb-account-create-update/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.466666 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_87f2f4be-70c8-409a-8fe8-c753758021f4/rabbitmq/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.645733 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-564d9f6b4c-5p6js_28cb2c9b-231d-473b-bbd2-ba4be8c6787e/proxy-httpd/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.673266 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-564d9f6b4c-5p6js_28cb2c9b-231d-473b-bbd2-ba4be8c6787e/proxy-server/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.810482 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-g2g29_011fcaf2-19dd-4b94-98c8-ba1ba81cd656/swift-ring-rebalance/0.log" Mar 20 16:27:57 crc kubenswrapper[4675]: I0320 16:27:57.915432 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/account-auditor/0.log" Mar 20 16:27:58 crc 
kubenswrapper[4675]: I0320 16:27:58.010413 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/account-reaper/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.031968 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/account-replicator/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.111048 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/account-server/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.191127 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/container-auditor/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.280850 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/container-server/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.290475 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/container-replicator/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.343677 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/container-updater/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.441656 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/object-auditor/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.455457 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/object-expirer/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.502138 4675 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/object-replicator/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.586583 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/object-server/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.624690 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/object-updater/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.669779 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/swift-recon-cron/0.log" Mar 20 16:27:58 crc kubenswrapper[4675]: I0320 16:27:58.686866 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1c15f64a-1ad0-4072-9f52-3b151c01a21b/rsync/0.log" Mar 20 16:27:59 crc kubenswrapper[4675]: I0320 16:27:59.424896 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7667dc1e-d72a-4119-8e30-a8267d0149f4/memcached/0.log" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.147189 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567068-9srgg"] Mar 20 16:28:00 crc kubenswrapper[4675]: E0320 16:28:00.147834 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca9405-d375-409d-90a2-9899c6fde998" containerName="container-00" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.147849 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca9405-d375-409d-90a2-9899c6fde998" containerName="container-00" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.148042 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="daca9405-d375-409d-90a2-9899c6fde998" containerName="container-00" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.148878 4675 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.151159 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.152313 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.152320 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.157158 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-9srgg"] Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.223002 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4c8q\" (UniqueName: \"kubernetes.io/projected/17750158-67c7-45d9-9ed2-ae8f2640dc0b-kube-api-access-d4c8q\") pod \"auto-csr-approver-29567068-9srgg\" (UID: \"17750158-67c7-45d9-9ed2-ae8f2640dc0b\") " pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.324671 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4c8q\" (UniqueName: \"kubernetes.io/projected/17750158-67c7-45d9-9ed2-ae8f2640dc0b-kube-api-access-d4c8q\") pod \"auto-csr-approver-29567068-9srgg\" (UID: \"17750158-67c7-45d9-9ed2-ae8f2640dc0b\") " pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.353979 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4c8q\" (UniqueName: \"kubernetes.io/projected/17750158-67c7-45d9-9ed2-ae8f2640dc0b-kube-api-access-d4c8q\") pod \"auto-csr-approver-29567068-9srgg\" (UID: 
\"17750158-67c7-45d9-9ed2-ae8f2640dc0b\") " pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.468107 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.685597 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:28:00 crc kubenswrapper[4675]: E0320 16:28:00.686299 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:28:00 crc kubenswrapper[4675]: I0320 16:28:00.951992 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-9srgg"] Mar 20 16:28:01 crc kubenswrapper[4675]: I0320 16:28:01.670741 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-9srgg" event={"ID":"17750158-67c7-45d9-9ed2-ae8f2640dc0b","Type":"ContainerStarted","Data":"ade59b3b13dea3bbb89fffebdd0b13d81c98209147f293bfc9f830f807a1d3e1"} Mar 20 16:28:03 crc kubenswrapper[4675]: I0320 16:28:03.689933 4675 generic.go:334] "Generic (PLEG): container finished" podID="17750158-67c7-45d9-9ed2-ae8f2640dc0b" containerID="cacb6124c7d528e99608015aa4f351817bac0734595162013a7e75c753171270" exitCode=0 Mar 20 16:28:03 crc kubenswrapper[4675]: I0320 16:28:03.689984 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-9srgg" 
event={"ID":"17750158-67c7-45d9-9ed2-ae8f2640dc0b","Type":"ContainerDied","Data":"cacb6124c7d528e99608015aa4f351817bac0734595162013a7e75c753171270"} Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.129146 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.208495 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4c8q\" (UniqueName: \"kubernetes.io/projected/17750158-67c7-45d9-9ed2-ae8f2640dc0b-kube-api-access-d4c8q\") pod \"17750158-67c7-45d9-9ed2-ae8f2640dc0b\" (UID: \"17750158-67c7-45d9-9ed2-ae8f2640dc0b\") " Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.213671 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17750158-67c7-45d9-9ed2-ae8f2640dc0b-kube-api-access-d4c8q" (OuterVolumeSpecName: "kube-api-access-d4c8q") pod "17750158-67c7-45d9-9ed2-ae8f2640dc0b" (UID: "17750158-67c7-45d9-9ed2-ae8f2640dc0b"). InnerVolumeSpecName "kube-api-access-d4c8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.310840 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4c8q\" (UniqueName: \"kubernetes.io/projected/17750158-67c7-45d9-9ed2-ae8f2640dc0b-kube-api-access-d4c8q\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.713321 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-9srgg" event={"ID":"17750158-67c7-45d9-9ed2-ae8f2640dc0b","Type":"ContainerDied","Data":"ade59b3b13dea3bbb89fffebdd0b13d81c98209147f293bfc9f830f807a1d3e1"} Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.713361 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade59b3b13dea3bbb89fffebdd0b13d81c98209147f293bfc9f830f807a1d3e1" Mar 20 16:28:05 crc kubenswrapper[4675]: I0320 16:28:05.713379 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-9srgg" Mar 20 16:28:06 crc kubenswrapper[4675]: I0320 16:28:06.195838 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-zt586"] Mar 20 16:28:06 crc kubenswrapper[4675]: I0320 16:28:06.204676 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-zt586"] Mar 20 16:28:06 crc kubenswrapper[4675]: I0320 16:28:06.707627 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d5bdd9-188b-429c-a240-41424a96b5e4" path="/var/lib/kubelet/pods/04d5bdd9-188b-429c-a240-41424a96b5e4/volumes" Mar 20 16:28:14 crc kubenswrapper[4675]: I0320 16:28:14.673736 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:28:14 crc kubenswrapper[4675]: E0320 16:28:14.674531 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.275526 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-jrphq_19624364-2dec-43f0-961c-12c5071289fd/manager/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.459190 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/util/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.602472 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/util/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.631509 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/pull/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.661562 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/pull/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.889247 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/pull/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.916486 4675 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/util/0.log" Mar 20 16:28:20 crc kubenswrapper[4675]: I0320 16:28:20.949282 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd747btv4_3f783bd0-ccf7-4dab-9412-6f2b1a943424/extract/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.120429 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-ktbv8_ec2c3dde-b80d-4baa-a092-c38d978c7c4e/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.242433 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-c6j4l_56bf5935-2a04-4182-8db7-8b98736a96fa/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.404074 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-k6lwh_5ab24f7d-1842-40a4-8ab1-a0299644ecd5/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.478960 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-m6dhl_26f79a1c-7e90-4c87-8027-4044ec669321/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.670593 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-vrwzl_fdf0cf19-b7ab-4ea3-aa58-af2d8c4a335d/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.880906 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-gvrbv_5ce8a4db-9cfd-45da-82be-597b6f3b1257/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 
16:28:21.956138 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-8d4c8954d-flsfk_597a22d0-7193-41e5-8312-8cc9aa9a29a8/manager/0.log" Mar 20 16:28:21 crc kubenswrapper[4675]: I0320 16:28:21.979942 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-fpwf9_1c0ae0a7-9969-47b3-871f-536af2bd1784/manager/0.log" Mar 20 16:28:22 crc kubenswrapper[4675]: I0320 16:28:22.103852 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-tg6fk_34bb9539-fd7b-49da-99dd-548e9e8de389/manager/0.log" Mar 20 16:28:22 crc kubenswrapper[4675]: I0320 16:28:22.240317 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-q4k6l_5718414d-60fc-4af8-a8aa-46a12b8114ab/manager/0.log" Mar 20 16:28:22 crc kubenswrapper[4675]: I0320 16:28:22.301194 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-msv57_c4f8d16f-3951-48ab-8525-79be59c6d957/manager/0.log" Mar 20 16:28:22 crc kubenswrapper[4675]: I0320 16:28:22.516697 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-fvx4c_14e35dfe-49a2-4d89-9404-3ef0311be41e/manager/0.log" Mar 20 16:28:22 crc kubenswrapper[4675]: I0320 16:28:22.525174 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-bk29d_427a4fcc-2562-4dc5-8735-0f6a448533ab/manager/0.log" Mar 20 16:28:22 crc kubenswrapper[4675]: I0320 16:28:22.711018 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5nv6w9_45153c87-e6ea-4463-9a37-4bc2805530f8/manager/0.log" Mar 20 16:28:22 crc 
kubenswrapper[4675]: I0320 16:28:22.852419 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-94465cd74-nrsq2_7dfe43c8-9d92-424d-8f75-f4afffd29901/operator/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.250216 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7lv49_b65b6761-2183-4ab2-9c85-835a172cd2ee/registry-server/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.417905 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-hrp5b_4a01ec4a-d453-44a0-ae13-ef8607f3ccb3/manager/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.562708 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-mtkg9_442371dc-c0af-48c8-83fa-01012b636590/manager/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.676430 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-jh8ng_1cc3ed1d-cda9-47fc-b8a8-8c22e7dac6a7/manager/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.778516 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b5b55fc46-skxzd_74e6d0e1-86b4-4189-9dea-6ff21a5e5e3b/manager/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.847247 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-zngbt_9e05e473-0f8c-41a4-8b84-2c0c0d867f90/manager/0.log" Mar 20 16:28:23 crc kubenswrapper[4675]: I0320 16:28:23.996045 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-tphgw_9e316228-bcfc-4362-9786-67097a7b0730/manager/0.log" Mar 20 16:28:24 crc 
kubenswrapper[4675]: I0320 16:28:24.088720 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-bbdh2_762b742c-1a1d-4a82-8f60-b71e9fe44637/manager/0.log" Mar 20 16:28:25 crc kubenswrapper[4675]: I0320 16:28:25.674322 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:28:25 crc kubenswrapper[4675]: E0320 16:28:25.674906 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:28:36 crc kubenswrapper[4675]: I0320 16:28:36.674705 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:28:36 crc kubenswrapper[4675]: E0320 16:28:36.675491 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:28:42 crc kubenswrapper[4675]: I0320 16:28:42.320398 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ms29d_6152960d-fbba-4874-9127-cdd83b1d9d7a/control-plane-machine-set-operator/0.log" Mar 20 16:28:42 crc kubenswrapper[4675]: I0320 16:28:42.537736 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wzl6m_ceebe5f6-3cee-41d3-ab16-9d562cff84f8/kube-rbac-proxy/0.log" Mar 20 16:28:42 crc kubenswrapper[4675]: I0320 16:28:42.548245 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wzl6m_ceebe5f6-3cee-41d3-ab16-9d562cff84f8/machine-api-operator/0.log" Mar 20 16:28:47 crc kubenswrapper[4675]: I0320 16:28:47.673782 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:28:47 crc kubenswrapper[4675]: E0320 16:28:47.674549 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:28:53 crc kubenswrapper[4675]: I0320 16:28:53.954005 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ndpfw_b4d419db-f864-4092-88c5-dd3c49f133ee/cert-manager-controller/0.log" Mar 20 16:28:54 crc kubenswrapper[4675]: I0320 16:28:54.125390 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-69dsg_64366dbb-dc3a-4f63-a27e-c8f8b1b2e95a/cert-manager-cainjector/0.log" Mar 20 16:28:54 crc kubenswrapper[4675]: I0320 16:28:54.230225 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-r8sbm_1e52063a-3614-4b21-8411-d54225a3f2ed/cert-manager-webhook/0.log" Mar 20 16:28:57 crc kubenswrapper[4675]: I0320 16:28:57.164384 4675 scope.go:117] "RemoveContainer" containerID="6186dbac985c54733a1640322c538c03ab0fe9a306fd411d2f745642ee54de5f" Mar 20 16:28:57 crc 
kubenswrapper[4675]: I0320 16:28:57.184688 4675 scope.go:117] "RemoveContainer" containerID="94241f316e66ac5988644a931eae1198f8d67b4991a0efd911a7d74bc29be0f7" Mar 20 16:28:57 crc kubenswrapper[4675]: I0320 16:28:57.351341 4675 scope.go:117] "RemoveContainer" containerID="fd6a66917b7f0e5cf198aacb7bfd72ffce1fefd8c9299f3058986639f90435b2" Mar 20 16:28:57 crc kubenswrapper[4675]: I0320 16:28:57.384796 4675 scope.go:117] "RemoveContainer" containerID="f7ec49d56f30163de0e7cac0297ccb3050fb22c3498b7930d7e7d60846e5a65e" Mar 20 16:29:00 crc kubenswrapper[4675]: I0320 16:29:00.678943 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:29:00 crc kubenswrapper[4675]: E0320 16:29:00.679605 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:29:05 crc kubenswrapper[4675]: I0320 16:29:05.679543 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-hkvrt_48914fc8-8ca5-43e7-9048-11d34e7d4ed4/nmstate-console-plugin/0.log" Mar 20 16:29:05 crc kubenswrapper[4675]: I0320 16:29:05.797628 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-422rl_13927470-4093-4096-8a10-e1bdc0443571/nmstate-handler/0.log" Mar 20 16:29:05 crc kubenswrapper[4675]: I0320 16:29:05.882013 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-t7r64_e4f94194-a598-4ec7-aac0-ba8e1c3e3e34/kube-rbac-proxy/0.log" Mar 20 16:29:05 crc kubenswrapper[4675]: I0320 16:29:05.923886 4675 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-t7r64_e4f94194-a598-4ec7-aac0-ba8e1c3e3e34/nmstate-metrics/0.log" Mar 20 16:29:06 crc kubenswrapper[4675]: I0320 16:29:06.021606 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-4xn7f_193d5d89-d3ca-4090-abdb-284ad7cc91f9/nmstate-operator/0.log" Mar 20 16:29:06 crc kubenswrapper[4675]: I0320 16:29:06.160939 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-hc7pt_84187302-9f43-4f51-882d-d1cbdb1e22a3/nmstate-webhook/0.log" Mar 20 16:29:14 crc kubenswrapper[4675]: I0320 16:29:14.674253 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:29:14 crc kubenswrapper[4675]: E0320 16:29:14.677166 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:29:27 crc kubenswrapper[4675]: I0320 16:29:27.674379 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:29:27 crc kubenswrapper[4675]: E0320 16:29:27.676279 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:29:32 crc 
kubenswrapper[4675]: I0320 16:29:32.927461 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-phstr_f5a8475e-06f7-4340-ba60-cc910fb3e2c5/kube-rbac-proxy/0.log" Mar 20 16:29:32 crc kubenswrapper[4675]: I0320 16:29:32.986354 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-phstr_f5a8475e-06f7-4340-ba60-cc910fb3e2c5/controller/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.120785 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-frr-files/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.306482 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-frr-files/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.310347 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-metrics/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.325313 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-reloader/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.381819 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-reloader/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.504945 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-reloader/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.505997 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-metrics/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.537505 4675 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-frr-files/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.555276 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-metrics/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.727602 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-reloader/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.731351 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-frr-files/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.744565 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/cp-metrics/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.745103 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/controller/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.899402 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/frr-metrics/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.920722 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/kube-rbac-proxy/0.log" Mar 20 16:29:33 crc kubenswrapper[4675]: I0320 16:29:33.943065 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/kube-rbac-proxy-frr/0.log" Mar 20 16:29:34 crc kubenswrapper[4675]: I0320 16:29:34.116050 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jggh8_f44b2431-e7ee-41a8-98c0-957a933f8cd8/frr-k8s-webhook-server/0.log" Mar 20 16:29:34 crc kubenswrapper[4675]: I0320 16:29:34.118878 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/reloader/0.log" Mar 20 16:29:34 crc kubenswrapper[4675]: I0320 16:29:34.360909 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68f54df857-mz9xg_3e995caa-02b4-47fa-9e1f-2f40a8234f0c/manager/0.log" Mar 20 16:29:34 crc kubenswrapper[4675]: I0320 16:29:34.531497 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-859fb75b66-mjggm_2069d30f-51d2-4294-a579-cb0843239946/webhook-server/0.log" Mar 20 16:29:34 crc kubenswrapper[4675]: I0320 16:29:34.568007 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wcb96_79348741-9f2d-4c18-be91-9a49e0af8ec8/kube-rbac-proxy/0.log" Mar 20 16:29:34 crc kubenswrapper[4675]: I0320 16:29:34.844844 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64lgp_d2ccda47-7e80-4cab-8b14-fffa6e1a73b2/frr/0.log" Mar 20 16:29:35 crc kubenswrapper[4675]: I0320 16:29:35.067064 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wcb96_79348741-9f2d-4c18-be91-9a49e0af8ec8/speaker/0.log" Mar 20 16:29:41 crc kubenswrapper[4675]: I0320 16:29:41.674193 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:29:41 crc kubenswrapper[4675]: E0320 16:29:41.674959 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.458808 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/util/0.log" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.619611 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/util/0.log" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.645986 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/pull/0.log" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.685249 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/pull/0.log" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.816827 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/pull/0.log" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.833758 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/util/0.log" Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.859478 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874jzrr4_b1368284-4a62-4656-b794-96aa0ed4e775/extract/0.log"
Mar 20 16:29:46 crc kubenswrapper[4675]: I0320 16:29:46.979055 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/util/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.156485 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/util/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.172633 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/pull/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.176845 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/pull/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.393354 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/util/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.394531 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/extract/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.394616 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hwmxz_243915a2-beb9-4d55-914c-6c27c64ee50a/pull/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.553477 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/extract-utilities/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.709227 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/extract-utilities/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.720386 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/extract-content/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.734457 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/extract-content/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.896615 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/extract-utilities/0.log"
Mar 20 16:29:47 crc kubenswrapper[4675]: I0320 16:29:47.920291 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/extract-content/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.160928 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/extract-utilities/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.174339 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6fszb_2420ca6e-21bb-4804-b714-b0ac748a5d4a/registry-server/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.300539 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/extract-content/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.324408 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/extract-utilities/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.356541 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/extract-content/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.542777 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/extract-utilities/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.544458 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/extract-content/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.777484 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xhj8r_d62ebd56-8ca5-4cbe-b7af-15d2164540fe/marketplace-operator/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.816029 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/extract-utilities/0.log"
Mar 20 16:29:48 crc kubenswrapper[4675]: I0320 16:29:48.910941 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvl9k_d94b5a9e-b23c-497a-befb-9e5df7a05b76/registry-server/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.053086 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/extract-utilities/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.093158 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/extract-content/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.109488 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/extract-content/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.258780 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/extract-utilities/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.260894 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/extract-content/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.371073 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4ml97_a7348ed2-98c6-4d2f-b738-98961986accc/registry-server/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.468435 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/extract-utilities/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.641645 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/extract-utilities/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.667799 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/extract-content/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.685536 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/extract-content/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.853707 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/extract-content/0.log"
Mar 20 16:29:49 crc kubenswrapper[4675]: I0320 16:29:49.875787 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/extract-utilities/0.log"
Mar 20 16:29:50 crc kubenswrapper[4675]: I0320 16:29:50.074431 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pgdms_ae86b573-cc65-4b83-b812-9b74eaefbe62/registry-server/0.log"
Mar 20 16:29:54 crc kubenswrapper[4675]: I0320 16:29:54.674014 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:29:54 crc kubenswrapper[4675]: E0320 16:29:54.675033 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:29:57 crc kubenswrapper[4675]: I0320 16:29:57.510573 4675 scope.go:117] "RemoveContainer" containerID="df748cc02568643b3a9cce8a55dbb5a468996406070e638b5283f7c5472bd336"
Mar 20 16:29:57 crc kubenswrapper[4675]: I0320 16:29:57.536781 4675 scope.go:117] "RemoveContainer" containerID="c09a9f444e1923fddf3375475305a15ea2cd2303959f75ad89ff5686871d8663"
Mar 20 16:29:57 crc kubenswrapper[4675]: I0320 16:29:57.565989 4675 scope.go:117] "RemoveContainer" containerID="cf7887f1d141b695f121e7d572b7bf756afcb691fd7f4c8fa0c4708f5b4b512b"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.146600 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567070-f5lvt"]
Mar 20 16:30:00 crc kubenswrapper[4675]: E0320 16:30:00.147506 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17750158-67c7-45d9-9ed2-ae8f2640dc0b" containerName="oc"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.147524 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="17750158-67c7-45d9-9ed2-ae8f2640dc0b" containerName="oc"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.147793 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="17750158-67c7-45d9-9ed2-ae8f2640dc0b" containerName="oc"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.148591 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.151533 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.151868 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.152616 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.157537 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"]
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.159217 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.161432 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.161634 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.168958 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-f5lvt"]
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.179417 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"]
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.253513 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4zrs\" (UniqueName: \"kubernetes.io/projected/d554075d-9af8-4a82-ad74-7d28cf8e84e3-kube-api-access-m4zrs\") pod \"auto-csr-approver-29567070-f5lvt\" (UID: \"d554075d-9af8-4a82-ad74-7d28cf8e84e3\") " pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.355461 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-config-volume\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.355564 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69qt\" (UniqueName: \"kubernetes.io/projected/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-kube-api-access-j69qt\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.355615 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4zrs\" (UniqueName: \"kubernetes.io/projected/d554075d-9af8-4a82-ad74-7d28cf8e84e3-kube-api-access-m4zrs\") pod \"auto-csr-approver-29567070-f5lvt\" (UID: \"d554075d-9af8-4a82-ad74-7d28cf8e84e3\") " pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.355699 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-secret-volume\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.378022 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4zrs\" (UniqueName: \"kubernetes.io/projected/d554075d-9af8-4a82-ad74-7d28cf8e84e3-kube-api-access-m4zrs\") pod \"auto-csr-approver-29567070-f5lvt\" (UID: \"d554075d-9af8-4a82-ad74-7d28cf8e84e3\") " pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.457508 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-config-volume\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.457829 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69qt\" (UniqueName: \"kubernetes.io/projected/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-kube-api-access-j69qt\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.457900 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-secret-volume\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.458702 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-config-volume\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.462060 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-secret-volume\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.475804 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.478700 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69qt\" (UniqueName: \"kubernetes.io/projected/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-kube-api-access-j69qt\") pod \"collect-profiles-29567070-c9648\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.498881 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:00 crc kubenswrapper[4675]: I0320 16:30:00.985674 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"]
Mar 20 16:30:01 crc kubenswrapper[4675]: I0320 16:30:01.082294 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-f5lvt"]
Mar 20 16:30:01 crc kubenswrapper[4675]: W0320 16:30:01.089678 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd554075d_9af8_4a82_ad74_7d28cf8e84e3.slice/crio-69823e2d8c9e01a6a58002a62d7f29f9cf87d5a80d603bf7dd574859e5494b2b WatchSource:0}: Error finding container 69823e2d8c9e01a6a58002a62d7f29f9cf87d5a80d603bf7dd574859e5494b2b: Status 404 returned error can't find the container with id 69823e2d8c9e01a6a58002a62d7f29f9cf87d5a80d603bf7dd574859e5494b2b
Mar 20 16:30:01 crc kubenswrapper[4675]: I0320 16:30:01.779971 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-f5lvt" event={"ID":"d554075d-9af8-4a82-ad74-7d28cf8e84e3","Type":"ContainerStarted","Data":"69823e2d8c9e01a6a58002a62d7f29f9cf87d5a80d603bf7dd574859e5494b2b"}
Mar 20 16:30:01 crc kubenswrapper[4675]: I0320 16:30:01.783432 4675 generic.go:334] "Generic (PLEG): container finished" podID="5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" containerID="146094cb980a76b5f23b6aa1239af7aca0fcaca5852553acaaef1b8bb2e93712" exitCode=0
Mar 20 16:30:01 crc kubenswrapper[4675]: I0320 16:30:01.783584 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648" event={"ID":"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c","Type":"ContainerDied","Data":"146094cb980a76b5f23b6aa1239af7aca0fcaca5852553acaaef1b8bb2e93712"}
Mar 20 16:30:01 crc kubenswrapper[4675]: I0320 16:30:01.783735 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648" event={"ID":"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c","Type":"ContainerStarted","Data":"af181234d4bd48d9a41b129fbe24020c3b5ba7d95cbce7575d42d83e2425fcb6"}
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.130501 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.310199 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-secret-volume\") pod \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") "
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.310266 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-config-volume\") pod \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") "
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.310348 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j69qt\" (UniqueName: \"kubernetes.io/projected/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-kube-api-access-j69qt\") pod \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\" (UID: \"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c\") "
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.311190 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-config-volume" (OuterVolumeSpecName: "config-volume") pod "5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" (UID: "5b62a7c5-5ec8-494e-942f-ffc35cb6f63c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.316266 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" (UID: "5b62a7c5-5ec8-494e-942f-ffc35cb6f63c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.318507 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-kube-api-access-j69qt" (OuterVolumeSpecName: "kube-api-access-j69qt") pod "5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" (UID: "5b62a7c5-5ec8-494e-942f-ffc35cb6f63c"). InnerVolumeSpecName "kube-api-access-j69qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.412593 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j69qt\" (UniqueName: \"kubernetes.io/projected/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-kube-api-access-j69qt\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.412635 4675 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.412647 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b62a7c5-5ec8-494e-942f-ffc35cb6f63c-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.803751 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648" event={"ID":"5b62a7c5-5ec8-494e-942f-ffc35cb6f63c","Type":"ContainerDied","Data":"af181234d4bd48d9a41b129fbe24020c3b5ba7d95cbce7575d42d83e2425fcb6"}
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.803810 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af181234d4bd48d9a41b129fbe24020c3b5ba7d95cbce7575d42d83e2425fcb6"
Mar 20 16:30:03 crc kubenswrapper[4675]: I0320 16:30:03.804092 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-c9648"
Mar 20 16:30:04 crc kubenswrapper[4675]: E0320 16:30:04.718086 4675 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.234:57562->38.102.83.234:41845: read tcp 38.102.83.234:57562->38.102.83.234:41845: read: connection reset by peer
Mar 20 16:30:05 crc kubenswrapper[4675]: I0320 16:30:05.844065 4675 generic.go:334] "Generic (PLEG): container finished" podID="d554075d-9af8-4a82-ad74-7d28cf8e84e3" containerID="31c606593aba6c7e73e7fca3526b063f6a6c60dec3746b6f8d6231d425bffedd" exitCode=0
Mar 20 16:30:05 crc kubenswrapper[4675]: I0320 16:30:05.844157 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-f5lvt" event={"ID":"d554075d-9af8-4a82-ad74-7d28cf8e84e3","Type":"ContainerDied","Data":"31c606593aba6c7e73e7fca3526b063f6a6c60dec3746b6f8d6231d425bffedd"}
Mar 20 16:30:06 crc kubenswrapper[4675]: I0320 16:30:06.676328 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:30:06 crc kubenswrapper[4675]: E0320 16:30:06.676530 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.231945 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.393193 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4zrs\" (UniqueName: \"kubernetes.io/projected/d554075d-9af8-4a82-ad74-7d28cf8e84e3-kube-api-access-m4zrs\") pod \"d554075d-9af8-4a82-ad74-7d28cf8e84e3\" (UID: \"d554075d-9af8-4a82-ad74-7d28cf8e84e3\") "
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.416015 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d554075d-9af8-4a82-ad74-7d28cf8e84e3-kube-api-access-m4zrs" (OuterVolumeSpecName: "kube-api-access-m4zrs") pod "d554075d-9af8-4a82-ad74-7d28cf8e84e3" (UID: "d554075d-9af8-4a82-ad74-7d28cf8e84e3"). InnerVolumeSpecName "kube-api-access-m4zrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.495481 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4zrs\" (UniqueName: \"kubernetes.io/projected/d554075d-9af8-4a82-ad74-7d28cf8e84e3-kube-api-access-m4zrs\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.861971 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-f5lvt" event={"ID":"d554075d-9af8-4a82-ad74-7d28cf8e84e3","Type":"ContainerDied","Data":"69823e2d8c9e01a6a58002a62d7f29f9cf87d5a80d603bf7dd574859e5494b2b"}
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.862009 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69823e2d8c9e01a6a58002a62d7f29f9cf87d5a80d603bf7dd574859e5494b2b"
Mar 20 16:30:07 crc kubenswrapper[4675]: I0320 16:30:07.862263 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-f5lvt"
Mar 20 16:30:08 crc kubenswrapper[4675]: I0320 16:30:08.301158 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-dm796"]
Mar 20 16:30:08 crc kubenswrapper[4675]: I0320 16:30:08.308488 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-dm796"]
Mar 20 16:30:08 crc kubenswrapper[4675]: I0320 16:30:08.685539 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b15ecc4-edbe-4833-9118-7c6a4c7b3352" path="/var/lib/kubelet/pods/7b15ecc4-edbe-4833-9118-7c6a4c7b3352/volumes"
Mar 20 16:30:19 crc kubenswrapper[4675]: I0320 16:30:19.674164 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:30:19 crc kubenswrapper[4675]: E0320 16:30:19.674829 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:30:30 crc kubenswrapper[4675]: I0320 16:30:30.682141 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:30:30 crc kubenswrapper[4675]: E0320 16:30:30.683192 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:30:43 crc kubenswrapper[4675]: I0320 16:30:43.673931 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:30:43 crc kubenswrapper[4675]: E0320 16:30:43.674698 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:30:55 crc kubenswrapper[4675]: I0320 16:30:55.674051 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:30:55 crc kubenswrapper[4675]: E0320 16:30:55.674922 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:30:57 crc kubenswrapper[4675]: I0320 16:30:57.674696 4675 scope.go:117] "RemoveContainer" containerID="1bb17a3da7b17bfdae61a39990418f2e8995872083cb61e6b53e6302d30d2242"
Mar 20 16:31:08 crc kubenswrapper[4675]: I0320 16:31:08.674066 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:31:08 crc kubenswrapper[4675]: E0320 16:31:08.675105 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.050883 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jrt74"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.059715 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-272fr"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.068559 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qwg4d"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.078934 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3b23-account-create-update-zhhwz"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.088860 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1c94-account-create-update-gs974"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.098277 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1bbb-account-create-update-g82mn"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.108093 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-272fr"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.116344 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jrt74"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.133542 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qwg4d"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.155503 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3b23-account-create-update-zhhwz"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.175475 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1bbb-account-create-update-g82mn"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.186465 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1c94-account-create-update-gs974"]
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.684801 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aee7127-2ed6-4ece-9eaa-0dfda0be02ad" path="/var/lib/kubelet/pods/4aee7127-2ed6-4ece-9eaa-0dfda0be02ad/volumes"
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.685890 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561932af-1ef9-47ff-9da6-b661477b60ae" path="/var/lib/kubelet/pods/561932af-1ef9-47ff-9da6-b661477b60ae/volumes"
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.686718 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7deb55-a7bb-4207-822f-c348a40ee473" path="/var/lib/kubelet/pods/5c7deb55-a7bb-4207-822f-c348a40ee473/volumes"
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.687464 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ccf7599-4ec9-4023-b6a4-9517656c82f8" path="/var/lib/kubelet/pods/5ccf7599-4ec9-4023-b6a4-9517656c82f8/volumes"
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.688731 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d96077-f705-44cc-a64d-dd4d7df551a6" path="/var/lib/kubelet/pods/b1d96077-f705-44cc-a64d-dd4d7df551a6/volumes"
Mar 20 16:31:14 crc kubenswrapper[4675]: I0320 16:31:14.689438 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a8634b-cae4-44ff-b49b-3c2c12ef93fe" path="/var/lib/kubelet/pods/b5a8634b-cae4-44ff-b49b-3c2c12ef93fe/volumes"
Mar 20 16:31:20 crc kubenswrapper[4675]: I0320 16:31:20.699309 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:31:20 crc kubenswrapper[4675]: E0320 16:31:20.700284 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:31:21 crc kubenswrapper[4675]: I0320 16:31:21.029638 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zdp9x"]
Mar 20 16:31:21 crc kubenswrapper[4675]: I0320 16:31:21.039746 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zdp9x"]
Mar 20 16:31:22 crc kubenswrapper[4675]: I0320 16:31:22.685015 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a4f6bb-496f-4c50-9047-057827aefe77" path="/var/lib/kubelet/pods/36a4f6bb-496f-4c50-9047-057827aefe77/volumes"
Mar 20 16:31:31 crc kubenswrapper[4675]: I0320 16:31:31.666588 4675 generic.go:334] "Generic (PLEG): container finished" podID="a031bc01-da3d-4159-969f-a7509db918cd" containerID="6e91531eff589939311fc0dbd4877952a478b0f4b1e30c85650b4e35d1d2a294" exitCode=0
Mar 20 16:31:31 crc kubenswrapper[4675]: I0320 16:31:31.666654 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" event={"ID":"a031bc01-da3d-4159-969f-a7509db918cd","Type":"ContainerDied","Data":"6e91531eff589939311fc0dbd4877952a478b0f4b1e30c85650b4e35d1d2a294"}
Mar 20 16:31:31 crc kubenswrapper[4675]: I0320 16:31:31.667601 4675 scope.go:117] "RemoveContainer" containerID="6e91531eff589939311fc0dbd4877952a478b0f4b1e30c85650b4e35d1d2a294"
Mar 20 16:31:32 crc kubenswrapper[4675]: I0320 16:31:32.528323 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t5gcc_must-gather-sl9zj_a031bc01-da3d-4159-969f-a7509db918cd/gather/0.log"
Mar 20 16:31:35 crc kubenswrapper[4675]: I0320 16:31:35.673670 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:31:35 crc kubenswrapper[4675]: E0320 16:31:35.674367 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.308132 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t5gcc/must-gather-sl9zj"]
Mar 20 16:31:40 crc kubenswrapper[4675]: I0320
16:31:40.308897 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="copy" containerID="cri-o://ea88def3f8b5e08517350161e03768e97f3dd7bbc04f107671326230a065b749" gracePeriod=2 Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.318499 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t5gcc/must-gather-sl9zj"] Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.743815 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t5gcc_must-gather-sl9zj_a031bc01-da3d-4159-969f-a7509db918cd/copy/0.log" Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.744426 4675 generic.go:334] "Generic (PLEG): container finished" podID="a031bc01-da3d-4159-969f-a7509db918cd" containerID="ea88def3f8b5e08517350161e03768e97f3dd7bbc04f107671326230a065b749" exitCode=143 Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.744471 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15ab304a55bdb8c83e6a878a3bd05ad134e0ed3888ed4a23c3210caa1fe937e9" Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.763948 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t5gcc_must-gather-sl9zj_a031bc01-da3d-4159-969f-a7509db918cd/copy/0.log" Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.764320 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.921099 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-955s6\" (UniqueName: \"kubernetes.io/projected/a031bc01-da3d-4159-969f-a7509db918cd-kube-api-access-955s6\") pod \"a031bc01-da3d-4159-969f-a7509db918cd\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.921387 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a031bc01-da3d-4159-969f-a7509db918cd-must-gather-output\") pod \"a031bc01-da3d-4159-969f-a7509db918cd\" (UID: \"a031bc01-da3d-4159-969f-a7509db918cd\") " Mar 20 16:31:40 crc kubenswrapper[4675]: I0320 16:31:40.936981 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a031bc01-da3d-4159-969f-a7509db918cd-kube-api-access-955s6" (OuterVolumeSpecName: "kube-api-access-955s6") pod "a031bc01-da3d-4159-969f-a7509db918cd" (UID: "a031bc01-da3d-4159-969f-a7509db918cd"). InnerVolumeSpecName "kube-api-access-955s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:31:41 crc kubenswrapper[4675]: I0320 16:31:41.023196 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-955s6\" (UniqueName: \"kubernetes.io/projected/a031bc01-da3d-4159-969f-a7509db918cd-kube-api-access-955s6\") on node \"crc\" DevicePath \"\"" Mar 20 16:31:41 crc kubenswrapper[4675]: I0320 16:31:41.050396 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a031bc01-da3d-4159-969f-a7509db918cd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a031bc01-da3d-4159-969f-a7509db918cd" (UID: "a031bc01-da3d-4159-969f-a7509db918cd"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:31:41 crc kubenswrapper[4675]: I0320 16:31:41.124743 4675 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a031bc01-da3d-4159-969f-a7509db918cd-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 16:31:41 crc kubenswrapper[4675]: I0320 16:31:41.751043 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t5gcc/must-gather-sl9zj" Mar 20 16:31:42 crc kubenswrapper[4675]: I0320 16:31:42.683685 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a031bc01-da3d-4159-969f-a7509db918cd" path="/var/lib/kubelet/pods/a031bc01-da3d-4159-969f-a7509db918cd/volumes" Mar 20 16:31:46 crc kubenswrapper[4675]: I0320 16:31:46.041787 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ddcp2"] Mar 20 16:31:46 crc kubenswrapper[4675]: I0320 16:31:46.051342 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ddcp2"] Mar 20 16:31:46 crc kubenswrapper[4675]: I0320 16:31:46.674246 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:31:46 crc kubenswrapper[4675]: E0320 16:31:46.675020 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:31:46 crc kubenswrapper[4675]: I0320 16:31:46.686285 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e8338f-ae50-4341-aba0-91bf9890a9bc" 
path="/var/lib/kubelet/pods/97e8338f-ae50-4341-aba0-91bf9890a9bc/volumes" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.566958 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xkqbm"] Mar 20 16:31:52 crc kubenswrapper[4675]: E0320 16:31:52.570405 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="copy" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570442 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="copy" Mar 20 16:31:52 crc kubenswrapper[4675]: E0320 16:31:52.570464 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" containerName="collect-profiles" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570472 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" containerName="collect-profiles" Mar 20 16:31:52 crc kubenswrapper[4675]: E0320 16:31:52.570489 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="gather" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570505 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="gather" Mar 20 16:31:52 crc kubenswrapper[4675]: E0320 16:31:52.570531 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d554075d-9af8-4a82-ad74-7d28cf8e84e3" containerName="oc" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570538 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d554075d-9af8-4a82-ad74-7d28cf8e84e3" containerName="oc" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570733 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="copy" Mar 20 16:31:52 crc kubenswrapper[4675]: 
I0320 16:31:52.570745 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b62a7c5-5ec8-494e-942f-ffc35cb6f63c" containerName="collect-profiles" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570757 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a031bc01-da3d-4159-969f-a7509db918cd" containerName="gather" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.570779 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d554075d-9af8-4a82-ad74-7d28cf8e84e3" containerName="oc" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.572221 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.733544 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x69xs\" (UniqueName: \"kubernetes.io/projected/6e200798-43db-45ac-9610-c865f931df27-kube-api-access-x69xs\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.733950 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-utilities\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.733975 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-catalog-content\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 
crc kubenswrapper[4675]: I0320 16:31:52.823266 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkqbm"] Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.835371 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-utilities\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.835411 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-catalog-content\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.835502 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x69xs\" (UniqueName: \"kubernetes.io/projected/6e200798-43db-45ac-9610-c865f931df27-kube-api-access-x69xs\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.836226 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-catalog-content\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.838292 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-utilities\") pod \"redhat-marketplace-xkqbm\" 
(UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:52 crc kubenswrapper[4675]: I0320 16:31:52.863270 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x69xs\" (UniqueName: \"kubernetes.io/projected/6e200798-43db-45ac-9610-c865f931df27-kube-api-access-x69xs\") pod \"redhat-marketplace-xkqbm\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") " pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.106998 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkqbm" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.159199 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xgplq"] Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.164592 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.172261 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xgplq"] Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.244821 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-catalog-content\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.245232 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txlfw\" (UniqueName: \"kubernetes.io/projected/79d3ace4-0761-4de9-a8b4-9b5d104607b0-kube-api-access-txlfw\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") 
" pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.245277 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-utilities\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.347412 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txlfw\" (UniqueName: \"kubernetes.io/projected/79d3ace4-0761-4de9-a8b4-9b5d104607b0-kube-api-access-txlfw\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.347681 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-utilities\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.347807 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-catalog-content\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.348428 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-catalog-content\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " 
pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.349107 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-utilities\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.375942 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txlfw\" (UniqueName: \"kubernetes.io/projected/79d3ace4-0761-4de9-a8b4-9b5d104607b0-kube-api-access-txlfw\") pod \"redhat-operators-xgplq\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") " pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.561311 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgplq" Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.714049 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkqbm"] Mar 20 16:31:53 crc kubenswrapper[4675]: I0320 16:31:53.874664 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkqbm" event={"ID":"6e200798-43db-45ac-9610-c865f931df27","Type":"ContainerStarted","Data":"eee7dbd924e8834cc05ef12523f8378dffca45b1a2fce5388c7d9ff2d6624c75"} Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.034662 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xgplq"] Mar 20 16:31:54 crc kubenswrapper[4675]: W0320 16:31:54.035945 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d3ace4_0761_4de9_a8b4_9b5d104607b0.slice/crio-392e15e62a273e4d84ecd752e0a38d4fd1867de63ff1c306e62df7faef8937ec WatchSource:0}: 
Error finding container 392e15e62a273e4d84ecd752e0a38d4fd1867de63ff1c306e62df7faef8937ec: Status 404 returned error can't find the container with id 392e15e62a273e4d84ecd752e0a38d4fd1867de63ff1c306e62df7faef8937ec Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.884384 4675 generic.go:334] "Generic (PLEG): container finished" podID="6e200798-43db-45ac-9610-c865f931df27" containerID="db4c8d79a2a7b81a5101743fdfe2c4df207f4c367832efb8ee67026ffdb3ffb8" exitCode=0 Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.884439 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkqbm" event={"ID":"6e200798-43db-45ac-9610-c865f931df27","Type":"ContainerDied","Data":"db4c8d79a2a7b81a5101743fdfe2c4df207f4c367832efb8ee67026ffdb3ffb8"} Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.885848 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.887059 4675 generic.go:334] "Generic (PLEG): container finished" podID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerID="a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e" exitCode=0 Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.887096 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerDied","Data":"a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e"} Mar 20 16:31:54 crc kubenswrapper[4675]: I0320 16:31:54.887123 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerStarted","Data":"392e15e62a273e4d84ecd752e0a38d4fd1867de63ff1c306e62df7faef8937ec"} Mar 20 16:31:55 crc kubenswrapper[4675]: I0320 16:31:55.897508 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerStarted","Data":"5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113"} Mar 20 16:31:56 crc kubenswrapper[4675]: I0320 16:31:56.910190 4675 generic.go:334] "Generic (PLEG): container finished" podID="6e200798-43db-45ac-9610-c865f931df27" containerID="ffbf253a7a4c63fb140e389cdb4714eb3ff527f9376882c214f76fccd927828c" exitCode=0 Mar 20 16:31:56 crc kubenswrapper[4675]: I0320 16:31:56.910296 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkqbm" event={"ID":"6e200798-43db-45ac-9610-c865f931df27","Type":"ContainerDied","Data":"ffbf253a7a4c63fb140e389cdb4714eb3ff527f9376882c214f76fccd927828c"} Mar 20 16:31:56 crc kubenswrapper[4675]: I0320 16:31:56.915635 4675 generic.go:334] "Generic (PLEG): container finished" podID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerID="5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113" exitCode=0 Mar 20 16:31:56 crc kubenswrapper[4675]: I0320 16:31:56.915672 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerDied","Data":"5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113"} Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.674910 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:31:57 crc kubenswrapper[4675]: E0320 16:31:57.675431 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" 
podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.761142 4675 scope.go:117] "RemoveContainer" containerID="c2f1468059e8174d62f1387a368b5742bd91d55c9e02756c2b1915074ea029d8" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.782386 4675 scope.go:117] "RemoveContainer" containerID="fd6c82753ba044d5aa2f2b747fe88d0159a1d37e2f1fc156cf3becb03ef96b18" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.799757 4675 scope.go:117] "RemoveContainer" containerID="164a1ac489e08c78060220608e53c3adc593ec270e34d9810845727dbf5269e0" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.848220 4675 scope.go:117] "RemoveContainer" containerID="0c8ae4b715416c15fca67bb4b80c2373fed733c878ef6efd9de39e94eee7e17f" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.880476 4675 scope.go:117] "RemoveContainer" containerID="6f26dd6172245e083d07812b969d3f493470dca054220bdcc2a5aec2369262d3" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.901662 4675 scope.go:117] "RemoveContainer" containerID="bd0c8ed35413f56ff4910847636a652a035f6c024fc37b2bdcfd71375215187a" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.932289 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkqbm" event={"ID":"6e200798-43db-45ac-9610-c865f931df27","Type":"ContainerStarted","Data":"b1d0e98184dda2aa20f118a057420086a22dd32ac87ee1607568378685c8bc81"} Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.945106 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerStarted","Data":"0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8"} Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.956143 4675 scope.go:117] "RemoveContainer" containerID="1e65565122945947fdb277280a030c7f0fe3a61be1efb56485c1efb8eef89ea9" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 
16:31:57.958910 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xkqbm" podStartSLOduration=3.50717076 podStartE2EDuration="5.958891196s" podCreationTimestamp="2026-03-20 16:31:52 +0000 UTC" firstStartedPulling="2026-03-20 16:31:54.885551372 +0000 UTC m=+1834.919180909" lastFinishedPulling="2026-03-20 16:31:57.337271808 +0000 UTC m=+1837.370901345" observedRunningTime="2026-03-20 16:31:57.948529243 +0000 UTC m=+1837.982158780" watchObservedRunningTime="2026-03-20 16:31:57.958891196 +0000 UTC m=+1837.992520723" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.980727 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xgplq" podStartSLOduration=2.482629816 podStartE2EDuration="4.980709652s" podCreationTimestamp="2026-03-20 16:31:53 +0000 UTC" firstStartedPulling="2026-03-20 16:31:54.888423213 +0000 UTC m=+1834.922052750" lastFinishedPulling="2026-03-20 16:31:57.386503049 +0000 UTC m=+1837.420132586" observedRunningTime="2026-03-20 16:31:57.969710391 +0000 UTC m=+1838.003339928" watchObservedRunningTime="2026-03-20 16:31:57.980709652 +0000 UTC m=+1838.014339189" Mar 20 16:31:57 crc kubenswrapper[4675]: I0320 16:31:57.995751 4675 scope.go:117] "RemoveContainer" containerID="ada7ba2162a6e798a495f93e6fe37af99800ac108c88fcde02f956f56c55f91b" Mar 20 16:31:58 crc kubenswrapper[4675]: I0320 16:31:58.018805 4675 scope.go:117] "RemoveContainer" containerID="4bb3915f874f97ba51894484158e8d30440cf0a06fc3802a6cf5b8a6e55bec67" Mar 20 16:31:58 crc kubenswrapper[4675]: I0320 16:31:58.037186 4675 scope.go:117] "RemoveContainer" containerID="f7f5d86c4305bed12ce387b9b14b043be57f9c9d26cfbfe21b72915cc0bf1d29" Mar 20 16:31:58 crc kubenswrapper[4675]: I0320 16:31:58.057390 4675 scope.go:117] "RemoveContainer" containerID="497f4d7c1119b9973b58b8991ef37976e205dca3b190cd6e110a7aa00c073e7d" Mar 20 16:31:58 crc kubenswrapper[4675]: I0320 16:31:58.076319 
4675 scope.go:117] "RemoveContainer" containerID="36ce2698098692684c91e2c881dfc739ce38c9a1f105bee77f181b0ae416ac10" Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.166553 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567072-cb6m6"] Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.172385 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-cb6m6" Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.175361 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.175550 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.175728 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.187539 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-cb6m6"] Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.302541 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zx25\" (UniqueName: \"kubernetes.io/projected/52326f93-07dd-4632-9f5d-8e544a9fd4e5-kube-api-access-4zx25\") pod \"auto-csr-approver-29567072-cb6m6\" (UID: \"52326f93-07dd-4632-9f5d-8e544a9fd4e5\") " pod="openshift-infra/auto-csr-approver-29567072-cb6m6" Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.405780 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zx25\" (UniqueName: \"kubernetes.io/projected/52326f93-07dd-4632-9f5d-8e544a9fd4e5-kube-api-access-4zx25\") pod \"auto-csr-approver-29567072-cb6m6\" (UID: \"52326f93-07dd-4632-9f5d-8e544a9fd4e5\") " 
pod="openshift-infra/auto-csr-approver-29567072-cb6m6"
Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.429299 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zx25\" (UniqueName: \"kubernetes.io/projected/52326f93-07dd-4632-9f5d-8e544a9fd4e5-kube-api-access-4zx25\") pod \"auto-csr-approver-29567072-cb6m6\" (UID: \"52326f93-07dd-4632-9f5d-8e544a9fd4e5\") " pod="openshift-infra/auto-csr-approver-29567072-cb6m6"
Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.516992 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-cb6m6"
Mar 20 16:32:00 crc kubenswrapper[4675]: I0320 16:32:00.989888 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-cb6m6"]
Mar 20 16:32:01 crc kubenswrapper[4675]: I0320 16:32:01.982461 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-cb6m6" event={"ID":"52326f93-07dd-4632-9f5d-8e544a9fd4e5","Type":"ContainerStarted","Data":"e8ecb4e3e2075e3f859c5c86c2d73e59687eb043a6a57b19fd7eae4f6f194142"}
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.044331 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d571-account-create-update-ghkph"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.055475 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jp9g7"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.070256 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b060-account-create-update-5gxch"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.078126 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zj9wf"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.088899 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jp9g7"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.094655 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d571-account-create-update-ghkph"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.105002 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b060-account-create-update-5gxch"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.107983 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xkqbm"
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.108026 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xkqbm"
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.113365 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7038-account-create-update-9c4gc"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.122157 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zj9wf"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.129802 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7038-account-create-update-9c4gc"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.140060 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-kwslq"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.150387 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-kwslq"]
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.154818 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xkqbm"
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.561699 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xgplq"
Mar 20 16:32:03 crc kubenswrapper[4675]: I0320 16:32:03.562059 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xgplq"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.052276 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xkqbm"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.105185 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkqbm"]
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.642518 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xgplq" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:32:04 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s
Mar 20 16:32:04 crc kubenswrapper[4675]: >
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.683707 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6fe44b-0699-4235-af89-d546820b782a" path="/var/lib/kubelet/pods/2e6fe44b-0699-4235-af89-d546820b782a/volumes"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.684584 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b97a81-a094-4cb5-ba97-33eb354c1d97" path="/var/lib/kubelet/pods/42b97a81-a094-4cb5-ba97-33eb354c1d97/volumes"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.685288 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c297c7-2162-4bd6-bd83-db8bbc61d008" path="/var/lib/kubelet/pods/45c297c7-2162-4bd6-bd83-db8bbc61d008/volumes"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.685894 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931f540a-ffd9-4d4a-b001-f68408fa02fb" path="/var/lib/kubelet/pods/931f540a-ffd9-4d4a-b001-f68408fa02fb/volumes"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.687353 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9503b7d8-c02c-4d31-9711-271cd2be4778" path="/var/lib/kubelet/pods/9503b7d8-c02c-4d31-9711-271cd2be4778/volumes"
Mar 20 16:32:04 crc kubenswrapper[4675]: I0320 16:32:04.688072 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b7b710-7048-42c1-8215-c242f34da40f" path="/var/lib/kubelet/pods/a0b7b710-7048-42c1-8215-c242f34da40f/volumes"
Mar 20 16:32:05 crc kubenswrapper[4675]: I0320 16:32:05.015556 4675 generic.go:334] "Generic (PLEG): container finished" podID="52326f93-07dd-4632-9f5d-8e544a9fd4e5" containerID="119c5e35dc4dbfe6869f33c3dc367e7d13fd8ddc1d451929a4194b4f1bb0ebfc" exitCode=0
Mar 20 16:32:05 crc kubenswrapper[4675]: I0320 16:32:05.015628 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-cb6m6" event={"ID":"52326f93-07dd-4632-9f5d-8e544a9fd4e5","Type":"ContainerDied","Data":"119c5e35dc4dbfe6869f33c3dc367e7d13fd8ddc1d451929a4194b4f1bb0ebfc"}
Mar 20 16:32:06 crc kubenswrapper[4675]: I0320 16:32:06.027070 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xkqbm" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="registry-server" containerID="cri-o://b1d0e98184dda2aa20f118a057420086a22dd32ac87ee1607568378685c8bc81" gracePeriod=2
Mar 20 16:32:06 crc kubenswrapper[4675]: I0320 16:32:06.377130 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-cb6m6"
Mar 20 16:32:06 crc kubenswrapper[4675]: I0320 16:32:06.521150 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zx25\" (UniqueName: \"kubernetes.io/projected/52326f93-07dd-4632-9f5d-8e544a9fd4e5-kube-api-access-4zx25\") pod \"52326f93-07dd-4632-9f5d-8e544a9fd4e5\" (UID: \"52326f93-07dd-4632-9f5d-8e544a9fd4e5\") "
Mar 20 16:32:06 crc kubenswrapper[4675]: I0320 16:32:06.527220 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52326f93-07dd-4632-9f5d-8e544a9fd4e5-kube-api-access-4zx25" (OuterVolumeSpecName: "kube-api-access-4zx25") pod "52326f93-07dd-4632-9f5d-8e544a9fd4e5" (UID: "52326f93-07dd-4632-9f5d-8e544a9fd4e5"). InnerVolumeSpecName "kube-api-access-4zx25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:32:06 crc kubenswrapper[4675]: I0320 16:32:06.623879 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zx25\" (UniqueName: \"kubernetes.io/projected/52326f93-07dd-4632-9f5d-8e544a9fd4e5-kube-api-access-4zx25\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:07 crc kubenswrapper[4675]: I0320 16:32:07.034314 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-cb6m6" event={"ID":"52326f93-07dd-4632-9f5d-8e544a9fd4e5","Type":"ContainerDied","Data":"e8ecb4e3e2075e3f859c5c86c2d73e59687eb043a6a57b19fd7eae4f6f194142"}
Mar 20 16:32:07 crc kubenswrapper[4675]: I0320 16:32:07.034362 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-cb6m6"
Mar 20 16:32:07 crc kubenswrapper[4675]: I0320 16:32:07.034377 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8ecb4e3e2075e3f859c5c86c2d73e59687eb043a6a57b19fd7eae4f6f194142"
Mar 20 16:32:07 crc kubenswrapper[4675]: I0320 16:32:07.444228 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-2sgx8"]
Mar 20 16:32:07 crc kubenswrapper[4675]: I0320 16:32:07.456223 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-2sgx8"]
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.033330 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wsz8z"]
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.041596 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wsz8z"]
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.046084 4675 generic.go:334] "Generic (PLEG): container finished" podID="6e200798-43db-45ac-9610-c865f931df27" containerID="b1d0e98184dda2aa20f118a057420086a22dd32ac87ee1607568378685c8bc81" exitCode=0
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.046140 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkqbm" event={"ID":"6e200798-43db-45ac-9610-c865f931df27","Type":"ContainerDied","Data":"b1d0e98184dda2aa20f118a057420086a22dd32ac87ee1607568378685c8bc81"}
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.523600 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkqbm"
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.661016 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x69xs\" (UniqueName: \"kubernetes.io/projected/6e200798-43db-45ac-9610-c865f931df27-kube-api-access-x69xs\") pod \"6e200798-43db-45ac-9610-c865f931df27\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") "
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.661212 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-catalog-content\") pod \"6e200798-43db-45ac-9610-c865f931df27\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") "
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.661352 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-utilities\") pod \"6e200798-43db-45ac-9610-c865f931df27\" (UID: \"6e200798-43db-45ac-9610-c865f931df27\") "
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.662433 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-utilities" (OuterVolumeSpecName: "utilities") pod "6e200798-43db-45ac-9610-c865f931df27" (UID: "6e200798-43db-45ac-9610-c865f931df27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.668699 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e200798-43db-45ac-9610-c865f931df27-kube-api-access-x69xs" (OuterVolumeSpecName: "kube-api-access-x69xs") pod "6e200798-43db-45ac-9610-c865f931df27" (UID: "6e200798-43db-45ac-9610-c865f931df27"). InnerVolumeSpecName "kube-api-access-x69xs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.685624 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f81f694-f0b5-4f31-a090-748418d6fd08" path="/var/lib/kubelet/pods/3f81f694-f0b5-4f31-a090-748418d6fd08/volumes"
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.687199 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68eaed96-a9c7-4876-9610-73f080a5372b" path="/var/lib/kubelet/pods/68eaed96-a9c7-4876-9610-73f080a5372b/volumes"
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.691077 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e200798-43db-45ac-9610-c865f931df27" (UID: "6e200798-43db-45ac-9610-c865f931df27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.763666 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.763697 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e200798-43db-45ac-9610-c865f931df27-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:08 crc kubenswrapper[4675]: I0320 16:32:08.763707 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x69xs\" (UniqueName: \"kubernetes.io/projected/6e200798-43db-45ac-9610-c865f931df27-kube-api-access-x69xs\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.058266 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkqbm" event={"ID":"6e200798-43db-45ac-9610-c865f931df27","Type":"ContainerDied","Data":"eee7dbd924e8834cc05ef12523f8378dffca45b1a2fce5388c7d9ff2d6624c75"}
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.058336 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkqbm"
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.058341 4675 scope.go:117] "RemoveContainer" containerID="b1d0e98184dda2aa20f118a057420086a22dd32ac87ee1607568378685c8bc81"
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.075156 4675 scope.go:117] "RemoveContainer" containerID="ffbf253a7a4c63fb140e389cdb4714eb3ff527f9376882c214f76fccd927828c"
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.099211 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkqbm"]
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.107188 4675 scope.go:117] "RemoveContainer" containerID="db4c8d79a2a7b81a5101743fdfe2c4df207f4c367832efb8ee67026ffdb3ffb8"
Mar 20 16:32:09 crc kubenswrapper[4675]: I0320 16:32:09.111303 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkqbm"]
Mar 20 16:32:10 crc kubenswrapper[4675]: I0320 16:32:10.674101 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:32:10 crc kubenswrapper[4675]: E0320 16:32:10.674636 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:32:10 crc kubenswrapper[4675]: I0320 16:32:10.685070 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e200798-43db-45ac-9610-c865f931df27" path="/var/lib/kubelet/pods/6e200798-43db-45ac-9610-c865f931df27/volumes"
Mar 20 16:32:13 crc kubenswrapper[4675]: I0320 16:32:13.610946 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xgplq"
Mar 20 16:32:13 crc kubenswrapper[4675]: I0320 16:32:13.656049 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xgplq"
Mar 20 16:32:13 crc kubenswrapper[4675]: I0320 16:32:13.858972 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xgplq"]
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.121799 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xgplq" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="registry-server" containerID="cri-o://0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8" gracePeriod=2
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.572365 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgplq"
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.702718 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-utilities\") pod \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") "
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.702871 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txlfw\" (UniqueName: \"kubernetes.io/projected/79d3ace4-0761-4de9-a8b4-9b5d104607b0-kube-api-access-txlfw\") pod \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") "
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.702989 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-catalog-content\") pod \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\" (UID: \"79d3ace4-0761-4de9-a8b4-9b5d104607b0\") "
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.703636 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-utilities" (OuterVolumeSpecName: "utilities") pod "79d3ace4-0761-4de9-a8b4-9b5d104607b0" (UID: "79d3ace4-0761-4de9-a8b4-9b5d104607b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.708786 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d3ace4-0761-4de9-a8b4-9b5d104607b0-kube-api-access-txlfw" (OuterVolumeSpecName: "kube-api-access-txlfw") pod "79d3ace4-0761-4de9-a8b4-9b5d104607b0" (UID: "79d3ace4-0761-4de9-a8b4-9b5d104607b0"). InnerVolumeSpecName "kube-api-access-txlfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.805392 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.805430 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txlfw\" (UniqueName: \"kubernetes.io/projected/79d3ace4-0761-4de9-a8b4-9b5d104607b0-kube-api-access-txlfw\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.847925 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79d3ace4-0761-4de9-a8b4-9b5d104607b0" (UID: "79d3ace4-0761-4de9-a8b4-9b5d104607b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:32:15 crc kubenswrapper[4675]: I0320 16:32:15.906697 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d3ace4-0761-4de9-a8b4-9b5d104607b0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.132358 4675 generic.go:334] "Generic (PLEG): container finished" podID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerID="0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8" exitCode=0
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.132398 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerDied","Data":"0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8"}
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.132421 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgplq"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.132437 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgplq" event={"ID":"79d3ace4-0761-4de9-a8b4-9b5d104607b0","Type":"ContainerDied","Data":"392e15e62a273e4d84ecd752e0a38d4fd1867de63ff1c306e62df7faef8937ec"}
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.132489 4675 scope.go:117] "RemoveContainer" containerID="0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.162107 4675 scope.go:117] "RemoveContainer" containerID="5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.174358 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xgplq"]
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.182622 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xgplq"]
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.185157 4675 scope.go:117] "RemoveContainer" containerID="a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.228897 4675 scope.go:117] "RemoveContainer" containerID="0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8"
Mar 20 16:32:16 crc kubenswrapper[4675]: E0320 16:32:16.229427 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8\": container with ID starting with 0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8 not found: ID does not exist" containerID="0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.229472 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8"} err="failed to get container status \"0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8\": rpc error: code = NotFound desc = could not find container \"0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8\": container with ID starting with 0f395d3cb13d984ed5196adbd7d45d71e9c6a56846f5f485573109da53583eb8 not found: ID does not exist"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.229500 4675 scope.go:117] "RemoveContainer" containerID="5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113"
Mar 20 16:32:16 crc kubenswrapper[4675]: E0320 16:32:16.230225 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113\": container with ID starting with 5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113 not found: ID does not exist" containerID="5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.230267 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113"} err="failed to get container status \"5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113\": rpc error: code = NotFound desc = could not find container \"5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113\": container with ID starting with 5df64c00944e3a737eea133890c23284191306aba6535cd6e0774624da915113 not found: ID does not exist"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.230295 4675 scope.go:117] "RemoveContainer" containerID="a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e"
Mar 20 16:32:16 crc kubenswrapper[4675]: E0320 16:32:16.230689 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e\": container with ID starting with a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e not found: ID does not exist" containerID="a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.230728 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e"} err="failed to get container status \"a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e\": rpc error: code = NotFound desc = could not find container \"a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e\": container with ID starting with a66cde6fca55f4efe33fb2b61604991013491672f3e4c911709b593d9c98e00e not found: ID does not exist"
Mar 20 16:32:16 crc kubenswrapper[4675]: I0320 16:32:16.690122 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" path="/var/lib/kubelet/pods/79d3ace4-0761-4de9-a8b4-9b5d104607b0/volumes"
Mar 20 16:32:24 crc kubenswrapper[4675]: I0320 16:32:24.695185 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:32:24 crc kubenswrapper[4675]: E0320 16:32:24.696600 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpfs5_openshift-machine-config-operator(31f7145a-b091-4511-a3e6-0c7d380dea57)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57"
Mar 20 16:32:38 crc kubenswrapper[4675]: I0320 16:32:38.052355 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ddfbw"]
Mar 20 16:32:38 crc kubenswrapper[4675]: I0320 16:32:38.060633 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ddfbw"]
Mar 20 16:32:38 crc kubenswrapper[4675]: I0320 16:32:38.686354 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7935d7aa-cb6b-4b66-a58f-31e0cce41114" path="/var/lib/kubelet/pods/7935d7aa-cb6b-4b66-a58f-31e0cce41114/volumes"
Mar 20 16:32:39 crc kubenswrapper[4675]: I0320 16:32:39.673964 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3"
Mar 20 16:32:40 crc kubenswrapper[4675]: I0320 16:32:40.376440 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"7af96bb5e2828996d9ced80bc266e94c3199f7936a5ea65cf3a32874e7d349a4"}
Mar 20 16:32:47 crc kubenswrapper[4675]: I0320 16:32:47.044114 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7tdnv"]
Mar 20 16:32:47 crc kubenswrapper[4675]: I0320 16:32:47.054937 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zlp5c"]
Mar 20 16:32:47 crc kubenswrapper[4675]: I0320 16:32:47.077705 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7tdnv"]
Mar 20 16:32:47 crc kubenswrapper[4675]: I0320 16:32:47.087310 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zlp5c"]
Mar 20 16:32:48 crc kubenswrapper[4675]: I0320 16:32:48.688085 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653f25dd-b7f2-4ec1-8569-96af48c4c388" path="/var/lib/kubelet/pods/653f25dd-b7f2-4ec1-8569-96af48c4c388/volumes"
Mar 20 16:32:48 crc kubenswrapper[4675]: I0320 16:32:48.689684 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de350b0e-5712-4f65-b01b-27814457bee4" path="/var/lib/kubelet/pods/de350b0e-5712-4f65-b01b-27814457bee4/volumes"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.262354 4675 scope.go:117] "RemoveContainer" containerID="6e91531eff589939311fc0dbd4877952a478b0f4b1e30c85650b4e35d1d2a294"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.346150 4675 scope.go:117] "RemoveContainer" containerID="1fd993bae4789312a411c055de30db24256d857c08bf7976bb053c0533e6ac11"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.373890 4675 scope.go:117] "RemoveContainer" containerID="76e4969eeedf95ae016802c002e9177a02620f38aadefdd4b2125a86f9955aec"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.424286 4675 scope.go:117] "RemoveContainer" containerID="6ad697ba14cc1d555c40e10733a5e068a20269a2df62c60822265fc27336cae1"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.482567 4675 scope.go:117] "RemoveContainer" containerID="4934ba92db35aeda85badaa33f2c923e25abd75d0bfc1c658a88889c496fa48d"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.505142 4675 scope.go:117] "RemoveContainer" containerID="749b242d0f1ff3fc57d506d4686590503b48ea75aa912e7fd61b29c71b0fed5b"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.551904 4675 scope.go:117] "RemoveContainer" containerID="6d9fc7c5607ea1e397ebe2446d4852a94c0858c3cc96acd50e6a521a74349272"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.573986 4675 scope.go:117] "RemoveContainer" containerID="ea88def3f8b5e08517350161e03768e97f3dd7bbc04f107671326230a065b749"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.592271 4675 scope.go:117] "RemoveContainer" containerID="a035d354514f07a42784453cbf1c18708287a767cde5b5ff3ea3487becced2bf"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.631907 4675 scope.go:117] "RemoveContainer" containerID="c1cb804e88f5086e8ec7fa58d372bb57a29f8793c41cdc06c5c27398bc5dd652"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.660484 4675 scope.go:117] "RemoveContainer" containerID="9e0089b35c709f2f9ddef36339c3d4968101c7fdf03c76ca09b8329ab9d182f7"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.699895 4675 scope.go:117] "RemoveContainer" containerID="1136e7f4d855412cb03c024ce9c2d6377c100158b937f00a07dd20c261f552c1"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.729948 4675 scope.go:117] "RemoveContainer" containerID="2b04de1c1e94561ecd1290ffd74176754b284dccc1353d9a15e9fa353bdf39e6"
Mar 20 16:32:58 crc kubenswrapper[4675]: I0320 16:32:58.774183 4675 scope.go:117] "RemoveContainer" containerID="eeb235a93af91eb5e0f789887f256f07d31ba071fcca0fb057b174f2a81edbda"
Mar 20 16:33:04 crc kubenswrapper[4675]: I0320 16:33:04.030134 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tt28r"]
Mar 20 16:33:04 crc kubenswrapper[4675]: I0320 16:33:04.037903 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tt28r"]
Mar 20 16:33:04 crc kubenswrapper[4675]: I0320 16:33:04.683149 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43" path="/var/lib/kubelet/pods/37bf6e0c-ea5d-405c-b8c7-37cfa0a0cc43/volumes"
Mar 20 16:33:05 crc kubenswrapper[4675]: I0320 16:33:05.033759 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hhfc9"]
Mar 20 16:33:05 crc kubenswrapper[4675]: I0320 16:33:05.043855 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hhfc9"]
Mar 20 16:33:06 crc kubenswrapper[4675]: I0320 16:33:06.684795 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1560aa0-d06c-4c98-80bf-0635065cac6f" path="/var/lib/kubelet/pods/c1560aa0-d06c-4c98-80bf-0635065cac6f/volumes"
Mar 20 16:33:50 crc kubenswrapper[4675]: I0320 16:33:50.040300 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pkw66"]
Mar 20 16:33:50 crc kubenswrapper[4675]: I0320 16:33:50.050753 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9kb49"]
Mar 20 16:33:50 crc kubenswrapper[4675]: I0320 16:33:50.062539 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pkw66"]
Mar 20 16:33:50 crc kubenswrapper[4675]: I0320 16:33:50.072524 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9kb49"]
Mar 20 16:33:50 crc kubenswrapper[4675]: I0320 16:33:50.692236 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d340bca-db9e-4748-9cad-c3856ffe6edf" path="/var/lib/kubelet/pods/6d340bca-db9e-4748-9cad-c3856ffe6edf/volumes"
Mar 20 16:33:50 crc kubenswrapper[4675]: I0320 16:33:50.693632 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93745d9e-fb27-46b1-9305-de6265b0cc8d" path="/var/lib/kubelet/pods/93745d9e-fb27-46b1-9305-de6265b0cc8d/volumes"
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.050285 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6ae5-account-create-update-7mtsv"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.059906 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-295ch"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.075513 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f9fc-account-create-update-qwrrp"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.097141 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fe47-account-create-update-gnq8t"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.105256 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f9fc-account-create-update-qwrrp"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.112058 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fe47-account-create-update-gnq8t"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.118903 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6ae5-account-create-update-7mtsv"]
Mar 20 16:33:51 crc kubenswrapper[4675]: I0320 16:33:51.128352 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-295ch"]
Mar 20 16:33:52 crc kubenswrapper[4675]: I0320 16:33:52.685212 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474cfa15-2932-4b81-a00e-fc9c6648e91b" path="/var/lib/kubelet/pods/474cfa15-2932-4b81-a00e-fc9c6648e91b/volumes"
Mar 20 16:33:52 crc kubenswrapper[4675]: I0320 16:33:52.686343 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18287f0-a719-4ea8-badd-3f2f13bd4209" path="/var/lib/kubelet/pods/a18287f0-a719-4ea8-badd-3f2f13bd4209/volumes"
Mar 20 16:33:52 crc kubenswrapper[4675]: I0320 16:33:52.687160 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc971d1-4036-4338-80ba-8f4f00c10b2a" path="/var/lib/kubelet/pods/bcc971d1-4036-4338-80ba-8f4f00c10b2a/volumes"
Mar 20 16:33:52 crc kubenswrapper[4675]: I0320 16:33:52.687887 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411" path="/var/lib/kubelet/pods/dbfc2b2e-09f6-4be6-a4d2-9c0fe1d4d411/volumes"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.037675 4675 scope.go:117] "RemoveContainer" containerID="c4694069148f8729d13a12a9d94b6b5be6dd7d4973f260458e56c3d138e7f773"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.083708 4675 scope.go:117] "RemoveContainer" containerID="1cd32ac5e10f7a48435c9cb3c618fabc8b405051c33448b1ed80c1fdeff879d9"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.117640 4675 scope.go:117] "RemoveContainer" containerID="565e9443c1e2334943f608354e27faef29f650b21304a7a47a30de30074de19f"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.164216 4675 scope.go:117] "RemoveContainer" containerID="42085d7c9d650b55c53543050b27eabe170a4e173ca53a32e847bbea569ba20f"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.200406 4675 scope.go:117] "RemoveContainer" containerID="b5cc1d310f5936a12d77118aa2706f1466f8b8c6a17cb8c62f755dbd76355a35"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.249860 4675 scope.go:117] "RemoveContainer" containerID="2ab4bebb194df8176b7f0261b07407260f70b70ef3699220892ddbcedfa0db9a"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.293392 4675 scope.go:117] "RemoveContainer" containerID="ec3ea6e5cb1890c029df8e139003b98a95136a354544197bfc0b5b7cc9dbbbf2"
Mar 20 16:33:59 crc kubenswrapper[4675]: I0320 16:33:59.311472 4675 scope.go:117] "RemoveContainer" containerID="545dee13ff9a3eafbd98459fc2690aca565d5de139fcb60d43e815c0e7d5f879"
Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145040 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567074-qvkx5"]
Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145555 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52326f93-07dd-4632-9f5d-8e544a9fd4e5" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145575 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="52326f93-07dd-4632-9f5d-8e544a9fd4e5" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145596 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145604 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145626 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="extract-content"
Mar 20 16:34:00 crc kubenswrapper[4675]: I0320
16:34:00.145633 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="extract-content" Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145649 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="extract-content" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145655 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="extract-content" Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145668 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145674 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145691 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="extract-utilities" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145698 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="extract-utilities" Mar 20 16:34:00 crc kubenswrapper[4675]: E0320 16:34:00.145723 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="extract-utilities" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145730 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="extract-utilities" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145938 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e200798-43db-45ac-9610-c865f931df27" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 
16:34:00.145961 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d3ace4-0761-4de9-a8b4-9b5d104607b0" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.145974 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="52326f93-07dd-4632-9f5d-8e544a9fd4e5" containerName="oc" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.146845 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.149944 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.150088 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.150312 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.153609 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-qvkx5"] Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.266338 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2b4g\" (UniqueName: \"kubernetes.io/projected/137bc72b-f549-4fea-ab8f-4476070343e3-kube-api-access-l2b4g\") pod \"auto-csr-approver-29567074-qvkx5\" (UID: \"137bc72b-f549-4fea-ab8f-4476070343e3\") " pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.368912 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2b4g\" (UniqueName: \"kubernetes.io/projected/137bc72b-f549-4fea-ab8f-4476070343e3-kube-api-access-l2b4g\") pod \"auto-csr-approver-29567074-qvkx5\" (UID: 
\"137bc72b-f549-4fea-ab8f-4476070343e3\") " pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.393271 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2b4g\" (UniqueName: \"kubernetes.io/projected/137bc72b-f549-4fea-ab8f-4476070343e3-kube-api-access-l2b4g\") pod \"auto-csr-approver-29567074-qvkx5\" (UID: \"137bc72b-f549-4fea-ab8f-4476070343e3\") " pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.475329 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:00 crc kubenswrapper[4675]: I0320 16:34:00.781856 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-qvkx5"] Mar 20 16:34:01 crc kubenswrapper[4675]: I0320 16:34:01.178158 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" event={"ID":"137bc72b-f549-4fea-ab8f-4476070343e3","Type":"ContainerStarted","Data":"0d64d3d8fc8234d4a75915606a47219ea84c4e3747ec83f487b20279247ac97b"} Mar 20 16:34:02 crc kubenswrapper[4675]: I0320 16:34:02.189447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" event={"ID":"137bc72b-f549-4fea-ab8f-4476070343e3","Type":"ContainerStarted","Data":"cc2b6859b3547b016844b5fbc822fd9d3fb24027669038833d61a4e567c3be05"} Mar 20 16:34:02 crc kubenswrapper[4675]: I0320 16:34:02.207646 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" podStartSLOduration=1.203372791 podStartE2EDuration="2.207629008s" podCreationTimestamp="2026-03-20 16:34:00 +0000 UTC" firstStartedPulling="2026-03-20 16:34:00.800436649 +0000 UTC m=+1960.834066176" lastFinishedPulling="2026-03-20 16:34:01.804692856 +0000 UTC m=+1961.838322393" 
observedRunningTime="2026-03-20 16:34:02.202212483 +0000 UTC m=+1962.235842040" watchObservedRunningTime="2026-03-20 16:34:02.207629008 +0000 UTC m=+1962.241258555" Mar 20 16:34:03 crc kubenswrapper[4675]: I0320 16:34:03.197953 4675 generic.go:334] "Generic (PLEG): container finished" podID="137bc72b-f549-4fea-ab8f-4476070343e3" containerID="cc2b6859b3547b016844b5fbc822fd9d3fb24027669038833d61a4e567c3be05" exitCode=0 Mar 20 16:34:03 crc kubenswrapper[4675]: I0320 16:34:03.197992 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" event={"ID":"137bc72b-f549-4fea-ab8f-4476070343e3","Type":"ContainerDied","Data":"cc2b6859b3547b016844b5fbc822fd9d3fb24027669038833d61a4e567c3be05"} Mar 20 16:34:04 crc kubenswrapper[4675]: I0320 16:34:04.546681 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:04 crc kubenswrapper[4675]: I0320 16:34:04.744367 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2b4g\" (UniqueName: \"kubernetes.io/projected/137bc72b-f549-4fea-ab8f-4476070343e3-kube-api-access-l2b4g\") pod \"137bc72b-f549-4fea-ab8f-4476070343e3\" (UID: \"137bc72b-f549-4fea-ab8f-4476070343e3\") " Mar 20 16:34:04 crc kubenswrapper[4675]: I0320 16:34:04.752162 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137bc72b-f549-4fea-ab8f-4476070343e3-kube-api-access-l2b4g" (OuterVolumeSpecName: "kube-api-access-l2b4g") pod "137bc72b-f549-4fea-ab8f-4476070343e3" (UID: "137bc72b-f549-4fea-ab8f-4476070343e3"). InnerVolumeSpecName "kube-api-access-l2b4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:34:04 crc kubenswrapper[4675]: I0320 16:34:04.847431 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2b4g\" (UniqueName: \"kubernetes.io/projected/137bc72b-f549-4fea-ab8f-4476070343e3-kube-api-access-l2b4g\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:05 crc kubenswrapper[4675]: I0320 16:34:05.241298 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" event={"ID":"137bc72b-f549-4fea-ab8f-4476070343e3","Type":"ContainerDied","Data":"0d64d3d8fc8234d4a75915606a47219ea84c4e3747ec83f487b20279247ac97b"} Mar 20 16:34:05 crc kubenswrapper[4675]: I0320 16:34:05.241338 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d64d3d8fc8234d4a75915606a47219ea84c4e3747ec83f487b20279247ac97b" Mar 20 16:34:05 crc kubenswrapper[4675]: I0320 16:34:05.241347 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-qvkx5" Mar 20 16:34:05 crc kubenswrapper[4675]: I0320 16:34:05.262150 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-9srgg"] Mar 20 16:34:05 crc kubenswrapper[4675]: I0320 16:34:05.269841 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-9srgg"] Mar 20 16:34:06 crc kubenswrapper[4675]: I0320 16:34:06.686727 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17750158-67c7-45d9-9ed2-ae8f2640dc0b" path="/var/lib/kubelet/pods/17750158-67c7-45d9-9ed2-ae8f2640dc0b/volumes" Mar 20 16:34:18 crc kubenswrapper[4675]: I0320 16:34:18.047592 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cvcgw"] Mar 20 16:34:18 crc kubenswrapper[4675]: I0320 16:34:18.061391 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-cvcgw"] Mar 20 16:34:18 crc kubenswrapper[4675]: I0320 16:34:18.685433 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5eba599-99e1-4899-8ae7-0ba38e60724b" path="/var/lib/kubelet/pods/e5eba599-99e1-4899-8ae7-0ba38e60724b/volumes" Mar 20 16:34:36 crc kubenswrapper[4675]: I0320 16:34:36.028510 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zbgx5"] Mar 20 16:34:36 crc kubenswrapper[4675]: I0320 16:34:36.040850 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gjbkq"] Mar 20 16:34:36 crc kubenswrapper[4675]: I0320 16:34:36.072062 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zbgx5"] Mar 20 16:34:36 crc kubenswrapper[4675]: I0320 16:34:36.078993 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gjbkq"] Mar 20 16:34:36 crc kubenswrapper[4675]: I0320 16:34:36.683922 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f80fe29-bb20-44a9-a687-24bc81243833" path="/var/lib/kubelet/pods/3f80fe29-bb20-44a9-a687-24bc81243833/volumes" Mar 20 16:34:36 crc kubenswrapper[4675]: I0320 16:34:36.685074 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ddb96c-6950-4300-bf0d-cf65d46c12fb" path="/var/lib/kubelet/pods/b9ddb96c-6950-4300-bf0d-cf65d46c12fb/volumes" Mar 20 16:34:59 crc kubenswrapper[4675]: I0320 16:34:59.441313 4675 scope.go:117] "RemoveContainer" containerID="cacb6124c7d528e99608015aa4f351817bac0734595162013a7e75c753171270" Mar 20 16:34:59 crc kubenswrapper[4675]: I0320 16:34:59.486987 4675 scope.go:117] "RemoveContainer" containerID="c939029382189ad85b94113049399625b490c4c552687aac5df5d72f3dbe945e" Mar 20 16:34:59 crc kubenswrapper[4675]: I0320 16:34:59.521838 4675 scope.go:117] "RemoveContainer" 
containerID="f3bf4a83559c2e6f4bc3f6b30050c34dca0f8f21efe574849db86a45a8664d7b" Mar 20 16:34:59 crc kubenswrapper[4675]: I0320 16:34:59.570567 4675 scope.go:117] "RemoveContainer" containerID="a43de24ddb10ac9b6b35757dcf8b118bcc5ee17c54a179acc3a9a5fad42ae441" Mar 20 16:35:04 crc kubenswrapper[4675]: I0320 16:35:04.425008 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:35:04 crc kubenswrapper[4675]: I0320 16:35:04.425606 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:35:19 crc kubenswrapper[4675]: I0320 16:35:19.037979 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-t9dln"] Mar 20 16:35:19 crc kubenswrapper[4675]: I0320 16:35:19.044895 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-t9dln"] Mar 20 16:35:20 crc kubenswrapper[4675]: I0320 16:35:20.684607 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8094317b-8891-4a65-ac16-3211f0209ba4" path="/var/lib/kubelet/pods/8094317b-8891-4a65-ac16-3211f0209ba4/volumes" Mar 20 16:35:34 crc kubenswrapper[4675]: I0320 16:35:34.424235 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:35:34 crc kubenswrapper[4675]: I0320 
16:35:34.424661 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:35:59 crc kubenswrapper[4675]: I0320 16:35:59.715004 4675 scope.go:117] "RemoveContainer" containerID="e627d3a3ea3368a314537f889754a1f200c2e501cdb41fc790010e7c2ae8f8d3" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.150870 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567076-vr55t"] Mar 20 16:36:00 crc kubenswrapper[4675]: E0320 16:36:00.151311 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137bc72b-f549-4fea-ab8f-4476070343e3" containerName="oc" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.151334 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="137bc72b-f549-4fea-ab8f-4476070343e3" containerName="oc" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.151499 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="137bc72b-f549-4fea-ab8f-4476070343e3" containerName="oc" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.152191 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.154907 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.158555 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.160387 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2rx" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.189643 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-vr55t"] Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.334238 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scpcd\" (UniqueName: \"kubernetes.io/projected/f34a5672-ff72-4737-a660-d7fe8712037b-kube-api-access-scpcd\") pod \"auto-csr-approver-29567076-vr55t\" (UID: \"f34a5672-ff72-4737-a660-d7fe8712037b\") " pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.437117 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scpcd\" (UniqueName: \"kubernetes.io/projected/f34a5672-ff72-4737-a660-d7fe8712037b-kube-api-access-scpcd\") pod \"auto-csr-approver-29567076-vr55t\" (UID: \"f34a5672-ff72-4737-a660-d7fe8712037b\") " pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.457170 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scpcd\" (UniqueName: \"kubernetes.io/projected/f34a5672-ff72-4737-a660-d7fe8712037b-kube-api-access-scpcd\") pod \"auto-csr-approver-29567076-vr55t\" (UID: \"f34a5672-ff72-4737-a660-d7fe8712037b\") " 
pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.479383 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:00 crc kubenswrapper[4675]: I0320 16:36:00.928437 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-vr55t"] Mar 20 16:36:01 crc kubenswrapper[4675]: I0320 16:36:01.307325 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-vr55t" event={"ID":"f34a5672-ff72-4737-a660-d7fe8712037b","Type":"ContainerStarted","Data":"7ba00bf4ad865058052743c4d755fc7e12ce04a7889a66439c4201cc329128fb"} Mar 20 16:36:03 crc kubenswrapper[4675]: I0320 16:36:03.329979 4675 generic.go:334] "Generic (PLEG): container finished" podID="f34a5672-ff72-4737-a660-d7fe8712037b" containerID="0e906658654553c7c85043d964eb11f4003ed3c1247e5ae0e04ccd3d24d00c08" exitCode=0 Mar 20 16:36:03 crc kubenswrapper[4675]: I0320 16:36:03.330230 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-vr55t" event={"ID":"f34a5672-ff72-4737-a660-d7fe8712037b","Type":"ContainerDied","Data":"0e906658654553c7c85043d964eb11f4003ed3c1247e5ae0e04ccd3d24d00c08"} Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.427886 4675 patch_prober.go:28] interesting pod/machine-config-daemon-tpfs5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.428149 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.428193 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.428877 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7af96bb5e2828996d9ced80bc266e94c3199f7936a5ea65cf3a32874e7d349a4"} pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.428921 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" podUID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerName="machine-config-daemon" containerID="cri-o://7af96bb5e2828996d9ced80bc266e94c3199f7936a5ea65cf3a32874e7d349a4" gracePeriod=600 Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.748328 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.923917 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scpcd\" (UniqueName: \"kubernetes.io/projected/f34a5672-ff72-4737-a660-d7fe8712037b-kube-api-access-scpcd\") pod \"f34a5672-ff72-4737-a660-d7fe8712037b\" (UID: \"f34a5672-ff72-4737-a660-d7fe8712037b\") " Mar 20 16:36:04 crc kubenswrapper[4675]: I0320 16:36:04.930100 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34a5672-ff72-4737-a660-d7fe8712037b-kube-api-access-scpcd" (OuterVolumeSpecName: "kube-api-access-scpcd") pod "f34a5672-ff72-4737-a660-d7fe8712037b" (UID: "f34a5672-ff72-4737-a660-d7fe8712037b"). InnerVolumeSpecName "kube-api-access-scpcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.026583 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scpcd\" (UniqueName: \"kubernetes.io/projected/f34a5672-ff72-4737-a660-d7fe8712037b-kube-api-access-scpcd\") on node \"crc\" DevicePath \"\"" Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.348178 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-vr55t" Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.348159 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-vr55t" event={"ID":"f34a5672-ff72-4737-a660-d7fe8712037b","Type":"ContainerDied","Data":"7ba00bf4ad865058052743c4d755fc7e12ce04a7889a66439c4201cc329128fb"} Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.348323 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ba00bf4ad865058052743c4d755fc7e12ce04a7889a66439c4201cc329128fb" Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.352153 4675 generic.go:334] "Generic (PLEG): container finished" podID="31f7145a-b091-4511-a3e6-0c7d380dea57" containerID="7af96bb5e2828996d9ced80bc266e94c3199f7936a5ea65cf3a32874e7d349a4" exitCode=0 Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.352191 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerDied","Data":"7af96bb5e2828996d9ced80bc266e94c3199f7936a5ea65cf3a32874e7d349a4"} Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.352232 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpfs5" event={"ID":"31f7145a-b091-4511-a3e6-0c7d380dea57","Type":"ContainerStarted","Data":"c786c0d7c0cd5c8424e8dd290208bced2c49e05c5724d4c4eaf08a7be227715c"} Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.352252 4675 scope.go:117] "RemoveContainer" containerID="a08adde3ddc0a67dc835707e28599926b820f09e1ba1882611a5fbcdd494dae3" Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.819791 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-f5lvt"] Mar 20 16:36:05 crc kubenswrapper[4675]: I0320 16:36:05.827825 4675 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-f5lvt"] Mar 20 16:36:06 crc kubenswrapper[4675]: I0320 16:36:06.684937 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d554075d-9af8-4a82-ad74-7d28cf8e84e3" path="/var/lib/kubelet/pods/d554075d-9af8-4a82-ad74-7d28cf8e84e3/volumes" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515157273605024460 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015157273606017376 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015157267067016525 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015157267070015467 5ustar corecore